Americans are taking more control over their work lives—because they have to

One thing that’s become clear in the past few tumultuous—and for many, traumatic—years is how easy it is to feel that we have no control over our lives. Control is a basic psychological need; it gives people a sense of agency over everything from how they live to where they work. One area where people have tried to wrest back control is work.