Fixed Costs and Overwork
I was thinking recently about overwork and the tendency, particularly in the U.S., for employees to spend long hours at the office. The straightforward view would be that employers here simply make employees work long hours, perhaps due to greater employer leverage (e.g., weak labor markets, limited recourse, lack of unions, etc.), which leaves U.S. employees with less power to resist overwork than their counterparts in other developed countries.
It could also plausibly be cultural. What happens between two parties is a function of what those parties want and expect. If, in the U.S., employees and employers simply have higher expectations for the number of hours employees must work, then employees may not fight as hard. Side note: I would love to see a comparison of U.S. and European salaried workers' final hourly compensation. It's quite hard to compare living expenses, social benefits (like healthcare), etc., but I suspect mid-tier U.S. salaried workers take home similar amounts to their peers while working much longer hours with fewer benefits, although potentially (but hardly assuredly) enjoying lower taxes.
Another explanation could be a relative shortage of skilled workers here, so that employers find the labor market to be very tight. It is true that the U.S. labor market is typically much tighter than the European one, but you'd also expect that to show up in higher salaries; I don't have good data on that, but I suspect it does not. Moreover, although the headline labor market looks tight, labor force participation is low, which I suspect plays a major role in why inflation remains mild - so I actually don't think we have a particularly tight labor market.
But I think an important overlooked factor in overwork is fixed costs, and in particular healthcare. In the U.S., bringing on a full-time person at most businesses entails buying health insurance for them, which is a big fixed cost. To see how this may impact overwork, imagine that an employer needs, say, forty more hours of work per week done. They could hire one more full-time worker, or they could ask ten people to work four more hours a week (or four people ten more hours, etc.). Hiring someone means paying for forty hours of work plus healthcare. Splitting those forty hours among current employees means paying only for forty hours of work. If the employees are salaried, they are still effectively being paid an hourly rate; it's just their salary divided by their expected number of hours. Their boss could simply ask them to work more and pay them for those marginal hours (reflected in their salary). Their boss could even pay them a little extra for that time (above their imputed hourly rate) by sharing some of the savings from not having to buy healthcare for a new employee.
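To make the arithmetic concrete, here is a toy back-of-the-envelope calculation. All of the numbers (salary, hours, insurance cost) are made up purely for illustration; the point is only the comparison between the two options.

# Toy numbers, purely illustrative.
salary = 60_000          # annual salary of a full-time salaried employee
expected_hours = 2_000   # roughly 40 hours/week for 50 weeks
healthcare = 15_000      # employer's annual health insurance cost per employee (fixed per head)

imputed_hourly = salary / expected_hours      # $30/hour

extra_hours = 40 * 50                         # forty more hours per week, for a year

# Option 1: hire one more full-time worker (pay for the hours plus the fixed cost).
cost_new_hire = salary + healthcare           # $75,000

# Option 2: split the hours among existing salaried staff at their imputed rate.
cost_spread = extra_hours * imputed_hourly    # $60,000

print(cost_new_hire - cost_spread)            # $15,000: the avoided fixed cost

The $15,000 gap is exactly the avoided healthcare cost, which is why the employer could even pay existing staff somewhat more than their imputed hourly rate and still come out ahead.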
One could argue that employees' hourly rate should reflect their healthcare benefits as well, so that if an employer asks employees to work longer hours, they would demand an hourly rate that reflected their (wages plus benefits)/hours rate. But I don't think that is how it actually works. Most people simply assume acceptable healthcare and focus on nominal salary (they may not even be aware of how much is being spent on their healthcare). Moreover, employees arguably shouldn't charge that rate, insofar as the marginal rate at which they are already selling their time is just their salary/hours rate. If, for example, your boss moved you from a five-day week to a four-day week, most employees would object quite strongly to a decline in salary in excess of 20%. But if their healthcare stayed the same, a 20% reduction in hours should result in a greater than 20% reduction in salary. That doesn't usually happen, though.
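Using the same made-up numbers as above, here is what the "fully loaded" rate argument implies for a 20% cut in hours.

# Same illustrative numbers as above.
salary = 60_000
expected_hours = 2_000
healthcare = 15_000

loaded_rate = (salary + healthcare) / expected_hours    # $37.50/hour, wages plus benefits

reduced_hours = 0.8 * expected_hours                    # a 20% cut in hours
# If healthcare stays fixed and total compensation per hour is held constant,
# the cash salary has to absorb the entire reduction:
new_salary = reduced_hours * loaded_rate - healthcare   # $45,000
print(1 - new_salary / salary)                          # 0.25, i.e. a 25% pay cut

So a 20% cut in hours would imply a 25% cut in salary in this example, which is a bigger cut than most employees would accept.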
It is this dynamic, the large fixed cost of healthcare in the U.S., that I think contributes to overwork in salaried environments. Interestingly, I think we can see a similar phenomenon play out with restaurant servers in Europe and the U.S. In the U.S., servers' tips are protected, employers cannot take or claim them, and they constitute the primary income for most servers. U.S. servers also have a special tipped minimum wage (at least in New York), well below the regular minimum wage. That means hiring a marginal server doesn't cost restaurants very much here, because most of that server's income is provided by the customers and is unavailable to the owner.
In Europe, the vast majority of servers' income comes from their wages. What do you see? A much lower staff-to-customer ratio (in my experience). European restaurants are understaffed relative to U.S. ones because owners actually pay for the service. In the U.S., by effectively demanding that everyone tip 20% and protecting those wages, we incentivize employers to hire more, diluting the wages of servers (even though they are "protected"), whereas in Europe, restaurant employers hire less labor.
So which system is better? My initial instinct was that it's better without fixed costs, but in fact it isn't clear. Fixed costs can benefit current employees by encouraging management to look internally for solutions and by limiting labor competition, potentially increasing wages. But they also distort employer incentives, potentially leading to overwork and reduced efficiency. As fixed costs grow very high, this distortionary effect likely becomes very inefficient.
Finally, virtually all hiring comes with fixed costs, notably training and institutional knowledge. The greater the skill involved and the more specialized the role, the higher the fixed cost (usually). At the higher-skill end of the labor market, adding to those fixed costs is probably quite bad (resulting in overwork), but at the lower end it may be good for wages overall, although it may result in less employment. In the U.S. we impose an additional high fixed cost on higher-skilled labor (healthcare), but virtually nothing on the lower end of the skill spectrum. I suspect Europe doesn't impose similar extra fixed costs on higher-skilled labor, but may have higher fixed costs (in terms of taxes and regulations) for lower-skilled labor.