This is especially interesting because the typical argument against what I understand most people mean when they say "Hungarian notation" -- that is, prefixes that indicate type, such as lpctstr -- is that your compiler should handle the type-checking anyway. I think that's a fair argument. But the argument above indicates that the real point is to express ideas that live outside of the types.
(I feel like this ought to be a rule of programming: "whenever you use a convention to indicate a rule, your programming language is failing you". I think it's cute that in Ruby, names that start with a capital letter are automatically constants, because that's what you meant by caps anyway.)
The ML perspective -- to an extent -- is to get rid of type coercions, then declare colors and screen heights as different types altogether. Even if they're both ints underneath, the point of having a type system is to have the compiler tell you when you're comparing apples and oranges. Pointers and ints are* the same things underneath, too!
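A minimal sketch of this style in Rust rather than ML (the type names Color and ScreenHeight are my own invention for illustration): two "newtype" wrappers around the same underlying int that the compiler refuses to mix up.

```rust
// Two distinct types that are both just an i32 underneath.
// Comparing a Color to a ScreenHeight is a compile error,
// even though the representations are identical.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Color(i32);

#[derive(Debug, Clone, Copy, PartialEq)]
struct ScreenHeight(i32);

fn main() {
    let c = Color(0x00ff00);
    let h = ScreenHeight(768);

    // Comparing like with like is fine:
    assert_eq!(c, Color(0x00ff00));
    assert_eq!(h, ScreenHeight(768));

    // This would be rejected, which is the whole point:
    // if c == h { ... }   // error: mismatched types
    println!("{:?} {:?}", c, h);
}
```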
(Note that this is different from wrapping colors and pixels into objects; the types erase after you compile, so the produced code still operates as if you had just declared them as "int", without any overhead of indirection into an "object".)
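One way to see the "no overhead" claim concretely, again sketched in Rust rather than ML (my own illustration of the general idea, not the ML compilation story itself): a newtype wrapper occupies exactly the same memory as the raw integer, whereas boxing the value behind an object adds a pointer's worth of indirection.

```rust
use std::mem::size_of;

// A wrapper type around i32 with no added indirection.
struct Color(i32);

fn main() {
    // The wrapper is the same size as the raw integer it wraps:
    assert_eq!(size_of::<Color>(), size_of::<i32>());

    // By contrast, a heap-boxed value is represented as a pointer:
    assert_eq!(size_of::<Box<i32>>(), size_of::<usize>());

    println!("Color is {} bytes, same as i32", size_of::<Color>());
}
```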
But this only works to an extent. I recall reading something (a web page? a paper?) about the limits of using type systems in this way. One of the ideal consequences is that you'd eventually be able to express, for example, meters, seconds, etc. as separate types and then combine them, so that the units in your physics equations would type-check for free. Unfortunately, as I recall, you run into problems expressing derived units like "meters per second" or "square meters" as combinations of the more primitive types.
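To make the units idea concrete, here's a Rust sketch (the type names are mine). You *can* get "meters per second" by hand-writing an operator impl, but notice that every derived unit -- m/s, m/s&#178;, m&#178; -- needs its own handwritten type and impl, which is exactly the scaling problem described above.

```rust
use std::ops::Div;

#[derive(Debug, Clone, Copy, PartialEq)]
struct Meters(f64);

#[derive(Debug, Clone, Copy, PartialEq)]
struct Seconds(f64);

#[derive(Debug, Clone, Copy, PartialEq)]
struct MetersPerSecond(f64);

// Dividing meters by seconds yields a velocity -- but this impl,
// and one like it for every other unit combination, has to be
// written out by hand.
impl Div<Seconds> for Meters {
    type Output = MetersPerSecond;
    fn div(self, t: Seconds) -> MetersPerSecond {
        MetersPerSecond(self.0 / t.0)
    }
}

fn main() {
    let v = Meters(10.0) / Seconds(2.0);
    assert_eq!(v, MetersPerSecond(5.0));

    // Meters(1.0) + Seconds(1.0) would not compile: mismatched types.
    println!("{:?}", v);
}
```

Languages with richer type-level arithmetic can derive these combination types automatically, which is what the limitation above is really about.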
* (Ok, not always, but you get the idea.)