Recent national surveys show that denominational Christianity in the United States continues to decline: although Christianity remains the country's majority faith tradition, the share of adults who identify with a specific, long-established denomination has dropped steadily over the past decade.
