A combination of factors. The main ones are catering to phones and tablets, and aiming software at the least capable users.
In the early days of the web, and of computing in general, most users came from universities or had already been in tech for a long while. Having to think for a few seconds to understand an interface was no burden if it delivered more power in the end.
Alas, most people simply cannot think for a few seconds, and it is at these minimally competent masses that most technology is now squarely aimed.
Removing any feature that might conceivably confuse anyone — even a recent brain-trauma victim — is the modus operandi these days.
Those of us who lived through a period when software genuinely improved with each release (roughly 1980–2009, with some regressions along the way) find it nearly unbelievable how much dread each “upgrade” now involves.
The only upside is that those of us who can manage to find something closer to older software can be vastly more productive in the workplace, making those who can only type on a phone screen, and only comprehend an interface built for preschoolers, look very bad by comparison.