Thanks to those who chimed in on my test question in a previous post (which you will need to review if you want to understand the present post).
So, people agreed with me that in the question I reproduced, the “correct” answer was to have each type of firm maximize profit, meaning the absolute dollar amount of profit.
Then, to answer the second part of the question, you figure out how many firms go into wheat vs. corn production by finding the number n that makes wheat producers earn (roughly) $2 profit, just like the corn producers do. (It’s not exactly $2 profit for the wheat producers, but if one more producer switched from corn into wheat, the profit of wheat producers would drop below $2.)
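To make the free-entry step concrete, here is a toy sketch in Python. The profit functions, the total number of producers, and every constant below are invented for illustration (the real production functions are in the earlier post); only the logic of the entry condition is the point.

```python
# Toy illustration of the free-entry step described above.
# All numbers here are made up; the real problem's production
# functions are in the earlier post.

N = 70  # assumed total number of producers to allocate


def wheat_profit(n):
    """Assumed: per-firm wheat profit falls as more firms grow wheat."""
    return 61.0 / n


def corn_profit(m):
    """Assumed: per-firm corn profit falls as more firms grow corn."""
    return 80.0 / m


# Producers keep switching into wheat as long as the marginal
# entrant would still earn at least what a corn producer earns.
n = 1
while n < N - 1 and wheat_profit(n + 1) >= corn_profit(N - (n + 1)):
    n += 1

print(n, round(wheat_profit(n), 2), round(corn_profit(N - n), 2))
```

At the resulting n, wheat profit is only *roughly* equal to corn profit: because n is a whole number, the two profits don’t line up exactly, but one more entrant into wheat would push wheat profit below corn profit — the same discreteness noted in the parenthetical above.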
Now, here’s what’s weird: At the “equilibrium” n, you’ve got (as we said) corn and wheat producers each earning (about) $2 profit. But they don’t hire the same number of laborers to produce that outcome. So one type of producer earns a very different *rate* of return on invested capital than the other. How is this an equilibrium outcome?
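To put numbers on the tension: suppose the wheat producer’s wage bill is $10 while the corn producer’s is $40. Both figures are made up for illustration, not taken from the problem — the point is just what equal dollar profits on unequal outlays do to the rates of return.

```python
# Invented numbers to illustrate the rate-of-return point:
# both firm types earn the same $2 of profit, but on very
# different upfront outlays on labor.
profit = 2.0          # equal dollar profit at the "equilibrium" n
wheat_outlay = 10.0   # assumed wage bill of a wheat producer
corn_outlay = 40.0    # assumed wage bill of a corn producer

wheat_rate = profit / wheat_outlay   # 0.20, i.e. a 20% return
corn_rate = profit / corn_outlay     # 0.05, i.e. a 5% return
print(wheat_rate, corn_rate)
```

With these (assumed) outlays, the wheat producer earns a 20% return while the corn producer earns 5% on the very same $2 of profit — which is exactly why equal dollar profits look like a strange stopping point for entry.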
(NOTE: I think I know the answer, but I’m just explaining the apparent problem with the way we economists typically solve one of these problems. We have the firm maximize the absolute dollars of profit, when we know that in real life investors would shift into the industry where the *rate* of profit is highest. Nobody cares about the dollar amount of profit irrespective of how much you need to spend upfront on factor inputs. So, to repeat, I think I know how to resolve this apparent tension, but I will wait to see what others think before I chime in.)