Even if this paper is correct, the approach is not practical. From Table 1 in Section 6.2, the LP representation of a 25-city TSP requires 24,580,032 variables and 3,643,825 constraints. Merely formulating that model will take significantly longer than fully optimizing a TSP of that size with an assignment or 2-matching relaxation, adding subtour elimination constraints as they are violated. Formulating the LP will likely take seconds (or maybe minutes) at that size, while the relaxation-plus-cuts approach can be finished in milliseconds.
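For anyone unfamiliar with that approach, here is a minimal sketch of the "relaxation plus subtour elimination" loop I mean, i.e. the classic Dantzig-Fulkerson-Johnson lazy-constraint scheme, not anything from the paper. It assumes the PuLP library and a hypothetical distance matrix `dist`, and it re-solves each iteration as an IP rather than using a tuned 2-matching/LP code, so treat it as illustrative rather than representative of the timings above.

```python
import itertools
import pulp


def solve_tsp_lazy(dist):
    n = len(dist)
    prob = pulp.LpProblem("tsp", pulp.LpMinimize)
    # Binary arc variables: x[i, j] = 1 if the tour goes from city i to city j.
    x = {(i, j): pulp.LpVariable(f"x_{i}_{j}", cat="Binary")
         for i in range(n) for j in range(n) if i != j}
    prob += pulp.lpSum(dist[i][j] * x[i, j] for (i, j) in x)
    # Assignment (degree) constraints: exactly one arc out of and into every city.
    for i in range(n):
        prob += pulp.lpSum(x[i, j] for j in range(n) if j != i) == 1
        prob += pulp.lpSum(x[j, i] for j in range(n) if j != i) == 1

    while True:
        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        succ = {i: j for (i, j) in x if pulp.value(x[i, j]) > 0.5}
        # Extract the cycles (subtours) of the current assignment solution.
        unvisited, subtours = set(range(n)), []
        while unvisited:
            start = cur = unvisited.pop()
            tour = [cur]
            while succ[cur] != start:
                cur = succ[cur]
                unvisited.discard(cur)
                tour.append(cur)
            subtours.append(tour)
        if len(subtours) == 1:
            return subtours[0], pulp.value(prob.objective)
        # Add a subtour elimination constraint only for the subsets actually
        # violated, then re-solve.
        for s in subtours:
            prob += pulp.lpSum(x[i, j]
                               for i, j in itertools.permutations(s, 2)) <= len(s) - 1
```

The point is that constraints are generated only as they are needed, so the model stays tiny compared to writing out millions of variables up front.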
The values in that table look like they're growing exponentially. By the time they reach 70 cities, they'll be into a trillion variables, which is not practical.
In real-world usage, the TSP gets into the thousands of "cities", for applications such as integrated circuit layouts.
"exponentially" is not something you can just colloquially throw about in this sort of discussion. The numbers are bounded by n^6, and the whole point of the table is that it seems like it's growing at even less than that.