I am not convinced by the common 10X programmer theories. My alternative theory: a lot of tiny, specific skills are necessary for optimal performance, and you probably do not have them all. Consequently, there is no general 10X programmer and there is no good way to predict performance. However, you can actively train to achieve top performance more often.
The "10X Programmer" is by definition more elite than most programmers: a rockstar who produces 10X faster or better or cheaper than the worst (not the average!) programmer, and who fixes bugs faster by an order of magnitude. The theory is usually extrapolated to claim that this is an innate talent, and that without the talent you will never reach this level. It is this common extrapolation I do not agree with, since differences in performance are merely an observed phenomenon; they say nothing about the cause.
Personally, I believe this talent is a myth. There is evidence for 10X differences in performance; however, I know of no evidence for the innate-talent theory.
So one developer can be 10 times as efficient as another, but it may depend on a random, obscure skill; for a different task, the other developer might have the critical skill instead. Usually, we cannot predict which skills will result in optimal performance. Only if you already know the cause of a bug can you tell whether, say, the memcheck skill would help. From the bug report alone, someone without that skill cannot even deduce that they should ask someone who has it for help. If you have the memcheck skill, you can probably predict whether it would be helpful. However, even with the skill it is hard or impossible to predict the actual speedup: using gdb instead of memcheck, will it take 10% longer or 10X longer in a specific case?
A corollary is that even an acknowledged elite programmer, like John Carmack or Fabrice Bellard, will perform poorly if his skills do not match the problem. This is a testable hypothesis, and in my opinion there is plenty of anecdotal evidence for it. For example, when elite programmer Carmack recommends more static analysis, it means he learned a new skill, applied it successfully, and believes it will pay off for others as well. Of course, there are many situations where this specific skill does not help at all. However, software engineering as a field tries to identify the most broadly beneficial skills. Version control, for example, is practically a required skill these days because it is helpful in so many cases.
Another corollary is that a broad set of cheaply acquired skills should be more efficient than a few expensive ones. Some skills, like "for loops", can be applied much more often than others, like "difference between charAt and index". This is also a testable hypothesis: for example, I would expect programmers who learn many different programming languages to be better programmers than single-language programmers. Of course, the problem remains how to define "better programmer". Still, this means you can actively train to become a rockstar programmer by focusing on the quantity and coverage of your skills. Being on a team of great programmers also helps, because you see which skills they use most often and which are worth learning yourself.
Yet another corollary is that how tasks are assigned matters. Maybe someone on your team has a better skill match for a problem, but finding the best match is unpredictable. I am not sure how this could be tested. Maybe the success of pair programming (if it is successful?) counts as evidence, since the combined skill set of two programmers should match a problem more often than that of one. It also means you should probably ask around among your colleagues when you get stuck on a problem and want to try new approaches.
Experience in general means more accumulated skills, so my theory also explains why experience improves productivity. However, it also means that if you switch to another field (e.g. from web to embedded development), you could lose your edge.
This is my opinion, and since I rarely see it in discussions, I have summarized it here. Critique welcome!