Python can waste 90% or 99% of your ops budget. But auto-scaling starts at 99%, and can go all the way to 4 or 5 9s of wastage. Of course, if you combine them, the 9s add up: each 9 of waste is another 10x on your unit costs, so 2 9s of Python on top of 5 9s of auto-scaling makes 7.
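To see why the 9s add, here's a minimal sketch (the waste figures are the illustrative ones above, not measurements, and `cost_multiplier` is just a toy helper, not anyone's API):

```python
def cost_multiplier(waste_fraction):
    """Unit-cost multiplier implied by a waste fraction.

    Wasting 90% of the budget leaves 10% doing useful work, so each
    unit of work costs 1 / 0.10 = 10x. Every extra 9 is another 10x.
    """
    return 1 / (1 - waste_fraction)

python_waste = 0.99          # 2 9s: Python at its worst, ~100x
autoscaling_waste = 0.99999  # 5 9s: auto-scaling at its worst, ~100,000x

# The useful fractions multiply, so the cost multipliers multiply,
# and the 9s add: 2 + 5 = 7 orders of magnitude.
combined = cost_multiplier(python_waste) * cost_multiplier(autoscaling_waste)
print(f"{combined:,.0f}x")  # 10,000,000x
```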
Anyway, it's patently obvious that increasing your unit costs by 7 orders of magnitude will constrain the kinds of business you can run. A few people will deny that, but the cloud's impact is so plain nowadays that very few do.
What a lot of people will deny is the relative importance of the two. Take your C code out of an auto-scaling environment and replace it with bare-metal Python, and you often get a few orders of magnitude of gain: you shed up to 5 9s of environment waste and take on only 2 9s of language waste.
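Under the same assumed figures, the swap works out like this (again a sketch, not a benchmark):

```python
# Assumed figures from above: multiplier = 1 / (1 - waste_fraction).
c_on_autoscaling = 1 / (1 - 0.99999)   # fast language, 5-9s environment: ~100,000x
python_on_bare_metal = 1 / (1 - 0.99)  # 2-9s language, efficient environment: ~100x

# Dropping 5 9s of environment waste while taking on 2 9s of language
# waste nets out to 3 orders of magnitude.
print(f"{c_on_autoscaling / python_on_bare_metal:,.0f}x cheaper")  # ~1,000x
```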