The feds spent almost 50 years putting everything out, because the folks who made the policy viewed fire as the loss of potentially valuable timber. The USFS didn't break with Silcox's 10 a.m. policy until the late '70s, and after 40-plus years of staunch enforcement, it took until the late '90s before they really started to change how they manage.
Part of it is that fire is... well... fire. It doesn't play. It doesn't like to be directed or controlled. Let something blow up too big and an agency can't stop it from destroying property, homes, etc. Thinning strategies can help. Prescribed burning can help. So can plain practicality: if something needs to be burned, let qualified people burn it and don't get in their way. The getting-in-the-way is one thing that makes me angry about the usually old hippy/flower-child types who drive so much of the conservation movement's left in the US, because they very much do obstructionist stuff.
Ecologically, we know that all forests burn. The question is the timeline: is it annual? Every five, 10, 20 years? 50? Every century (or longer)? And how has the way we've managed things over the last 120-ish years put stressors on that cycle? Even stand-replacing fires (big fuel loads; hot, dry, and windy conditions) can be part of a forest's life cycle. Hell, there is debate within forestry about what constitutes a healthy forest and when to let things burn vs. manage vs. always try to put things out.
(I was going for my wildland firefighting cert at one point recently. I have friends who do it for a couple of different agencies and some who do it privately for land management. I know folks who have rotated onto wildfires around the US and Canada, including this summer's Canadian conflagration.)