IIRC the property ‘each single digit has the same density, 1/b’ is the definition of a ‘simply normal number’ (in base b), while ‘every finite string of length k has density b⁻ᵏ, the same as every other string of that length’ is the definition of a ‘normal number’ (in base b). And then ‘normal in all bases’ is sometimes called ‘absolutely normal’, or just ‘normal’ without reference to a base.
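
A quick empirical sketch of these definitions (my own illustration, not from the thread): count digit and digit-pair frequencies in a prefix of Champernowne’s constant 0.12345678910111213…, which is known to be normal in base 10. Note that finite-prefix frequencies only approximate the limiting densities of 1/10 and 1/100, and convergence is slow.

```python
from collections import Counter

def champernowne_digits(n):
    """First n decimal digits of Champernowne's constant after the point,
    formed by concatenating 1, 2, 3, ... in order."""
    out, total, i = [], 0, 1
    while total < n:
        s = str(i)
        out.append(s)
        total += len(s)
        i += 1
    return "".join(out)[:n]

digits = champernowne_digits(100_000)

# 'Simply normal' in base 10: each single digit has limiting density 1/10.
single = Counter(digits)
densities = {d: single[d] / len(digits) for d in "0123456789"}

# 'Normal' in base 10 additionally requires every length-k block to have
# limiting density 10^(-k); here we spot-check one k = 2 block ("42")
# over all overlapping windows, whose density should approach 1/100.
pairs = Counter(digits[i:i + 2] for i in range(len(digits) - 1))
pair_density = pairs["42"] / (len(digits) - 1)
```

At this prefix length the single-digit frequencies are still visibly skewed (the digit 1 is overrepresented, since the prefix ends among 5-digit numbers starting with 1), which is a nice reminder that normality is a statement about the limit, not about any finite truncation.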