I don't think I've ever seen a paper that included much raw data. It may be a good suggestion, but it's not the way things are currently done.
Many journals (including a lot of the Nature ones) are requiring data and code on publication now.
Part of the problem is that, as far as I can tell, it's on the reviewers to flag noncompliance, and a lot of the time groups won't actually publish code and datasets before the paper is accepted and assigned a DOI.
So really it's a culture change around the whole publishing process that's needed, IMO.
Maybe not typically, but room-temp superconductivity is a Nobel Prize-level claim, and it seems that the experiment is not fully specified without the raw data.
The scientific community generally does a pretty good job of shooting down big claims like this once they're published (see: Pons and Fleischmann), so what's the huge benefit? On top of this, for it to have any value, the reviewers of the article would need to either recreate whatever code was used to process the raw data OR audit the existing code to make sure it made sense, both of which are pretty heavy-duty tasks. Reviewers don't get paid, so this is a good way to ensure no one wants to review articles for your journal.
If there is substantial interest in the paper after it has been published, it would be a lot easier to verify with the raw data than without. Nothing about the peer review process necessarily needs to change beyond immediately rejecting all papers that do not include the minimum amount of data needed to validate their claims (even if that data is simply stored and not immediately reviewed).