Many people have asked why I do it this way. I won't argue that better methods don't exist, but there are very specific reasons why I don't use them.
Here are the reasons:
1. Technical limits of the tech. I can't use morph targets because the Gfur addon has limited morph target support. Gfur is an addon that gives me realtime fur, so I can produce a high-quality character with fur without the render cost, for quick production turnaround, viral videos, and weekly animations. The end goal is low production cost, which requires a bit more pre-production but pays off in the long run.
2. Third-party compatibility. I have to export everything to Unreal Engine 4, meaning I can only export what FBX supports: many constraints don't export, lattices don't export, scale exports incorrectly because UE only handles scale in local space, and so on. I use a shadow rig constrained to a deformer rig to work around that. UE also has a 150 deformer-bone limit with a maximum of eight bone influences per vertex, so having smaller bones influence areas that in turn influence other bones works around that: fewer bones on the same vertex, same flexibility. This is also cheap on performance, in both Blender and UE.
3. Non-destructive flexibility. I need the same flexibility as lattices, morph shapes, and fine animation "drawing" control, in a form that exports cleanly and is non-destructive, but without actually using those features, so it stays compatible with UE and the Gfur addon. With blend morphs, editing the mesh would destroy the shapes and poses, and I'd need to rebuild hundreds of shapes again. That is expensive and time-consuming if I ever want to rebuild the rig on another character or change the mesh. With this rig, any pose and shape I build is non-destructive, has the potential to transfer (via animation libraries) to other characters, and still lets me edit the mesh. I'm building once, non-destructively, for hopefully many characters.
4. Performance. Morphs are set in stone and require heavy vertex evaluation all the time, plus drivers to drive them all. Drivers are harder to program (they require some code) and harder to set up, especially in Blender: for the same control and effect I'd need hundreds of them, then I'd have to boil them down to a few user-friendly controls with fancy code hacks, and creating that many drivers really hogs Blender's performance. I could get 30 fps out of this rig with OpenSubdiv and very few drivers, but with an equivalent morph-based driver rig I'd drop to 15-20 fps. I did this once in an easier way through nodal programming with Animation Nodes, but that was still a performance hog, down to 10-15 fps. Believe it or not, creating hundreds of drivers for hundreds of shapes takes about as much time as this rig setup, or more.
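To make the per-vertex budget from reason 2 concrete, here's a minimal pre-export sanity check in plain Python. The data shapes and names are my own illustration, not Blender's or UE's actual API: `weights` simply maps each vertex index to its bone weights, and the check flags vertices that exceed UE4's influence cap.

```python
# Hypothetical pre-export check against UE4 skinning limits.
# These constants reflect the limits discussed above.
MAX_INFLUENCES = 8      # UE4's per-vertex bone influence cap
MAX_DEFORM_BONES = 150  # deform-bone budget per skeletal mesh

def check_skinning(weights, threshold=0.001):
    """weights: {vertex_index: {bone_name: weight}}.
    Returns (bone_budget_ok, vertices_over_influence_cap)."""
    bones = set()
    over_limit = []
    for vert, influences in weights.items():
        # Ignore near-zero weights; exporters typically prune these anyway.
        active = {b: w for b, w in influences.items() if w > threshold}
        bones.update(active)
        if len(active) > MAX_INFLUENCES:
            over_limit.append(vert)
    return len(bones) <= MAX_DEFORM_BONES, over_limit

ok, bad_verts = check_skinning({
    0: {"spine": 0.6, "chest": 0.4},
    1: {"spine": 0.2, "chest": 0.8, "neck": 0.0005},  # tiny weight pruned
})
print(ok, bad_verts)  # True []
```

Keeping a script like this in the export step catches over-weighted vertices before UE silently re-normalizes them and changes the deformation.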
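The performance argument in reason 4 can be sketched as a rough back-of-envelope cost model. The numbers below are illustrative assumptions, not measurements: the point is simply that a morph-based rig evaluates every driven shape key over every vertex each frame, while linear blend skinning touches at most eight weighted influences per vertex.

```python
# Simplified cost model (assumption: all shape keys are driven every frame).
def morph_eval_ops(verts, active_shapes):
    # Each active shape key contributes one weighted offset per vertex.
    return verts * active_shapes

def bone_eval_ops(verts, max_influences=8):
    # Linear blend skinning: one matrix-weight accumulation per influence.
    return verts * max_influences

# e.g. a 20k-vertex character with 200 shape keys vs. the same mesh skinned
print(morph_eval_ops(20_000, 200))  # 200 shapes: prints 4000000
print(bone_eval_ops(20_000))        # 8 influences: prints 160000
```

The model ignores driver evaluation overhead entirely, which in practice makes the morph-based side even worse, as described above.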
So there you go. This is experimental, yes. Not a lot of people do it. It's not conventional. But it's built with purpose and a lot of justifiable reasons.
I want fur. I want realtime. I want Unreal Engine. I want non-destructive animation controls. I want cartoon squash and stretch, and I want it animated quickly, exported quickly, and rendered quickly, because I want weekly animations. And I want many characters with similar rigs, so my animators don't have to re-learn controls, animation is cross-compatible, and everything exports to a high-quality realtime render engine.
And I didn't want to auto-build the rig, because I want to understand how to make it work properly, so if an auto-rig breaks, I know how to fix it.
This is why I chose not to use blend morphs/shapes and to make this bone rig work instead.
Hope that makes sense! And yes, it is a crazy ambition, but so far quite rewarding.