Western countries like Norway, Denmark, the Netherlands, and Finland are all doing just fine without active religiosity being a major part of their culture. The current problems afflicting the US clearly have causes other than the decline of Christianity, and it's pretty obvious what those causes are.
Societies depend upon the existence of an intersubjective consensus. If people are able to put aside their differences and work together for the greater good of themselves and their society, then society is able to function. If divisions become so great that this is no longer possible, then society breaks down. The US is clearly falling into the latter category, and there are many reasons for this: urban ghettoization, unresolved racial tension, deindustrialization and its resulting structural unemployment and poverty, the broken windows approach to crime and education, the growing economic divide in America, etc.
Christianity could experience a huge resurgence in the United States, and it would do absolutely nothing to address the problems that led to the current situation. If anything, it would merely placate people, turning their focus inwards while their society burns around them.
I don't know if it's that dramatic, but it seems undeniable at this point that the United States is in serious decline, in relative terms and now possibly in absolute terms as well. People are quick to blame Trump, and he is definitely a catalyst, but really, the ultimate blame should fall upon the entrenched ineffectiveness of the political system that surrounds him. Without that, Trump would never have been elected.
The real question is whether the political will exists in America for the system to fundamentally change, and unfortunately, this appears doubtful. I think America is strong enough to avoid a collapse, but I fear it might be optimistic to expect much more than a managed decline.
Christianity has left its mark on Europe, but to pretend that it still exerts an active influence in Western society outside the United States and Eastern Europe is completely mistaken. I live in Western Europe, and I can tell you that Christianity has practically no relevance here anymore. For 99% of the population, church attendance is limited to weddings and funerals, and cathedrals are practically museums at this point.
You could equally say that Greek and Roman culture has had a lasting impact on Western society (which it has), but I think you'd struggle to find anyone today who believes in Zeus.