After a few months of OS devving, I've gotten more used to reading numbers in hex than in decimal. In my debugging output I've always used the format 0x1234ABCD (with a leading '0x' and capital letters), and sometimes just 1234ABCD. I use all caps mostly because it looks more consistent, more like a number. But I've seen plenty of places where lower case is used (like SHA1 hashes in git), as well as different conventions for denoting that a number is hexadecimal.
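For concreteness, here's roughly what the choice boils down to in C with printf-style formatting (the only real difference being %X versus %x):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t value = 0x1234ABCD;

        /* %X prints upper-case hex digits, %x lower-case; the "0x"
           prefix is added by hand here (the # flag would add it
           automatically, as 0X for %X and 0x for %x). */
        printf("0x%08X\n", value);  /* 0x1234ABCD */
        printf("0x%08x\n", value);  /* 0x1234abcd */
        return 0;
    }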
What is the general consensus on which display looks more aesthetically pleasing?
Are there any other forms in common use?
And what about outputting the digits in little-endian order (least significant digit first) instead of the usual big-endian order, so the text lines up more easily with the real binary? (There's a rough sketch of what I mean after the counting example below.)
e.g.
0x12345678
becomes
0x87654321
and you count (by 5s, for brevity):
0x0
0x5
0xA
0xF
0x41
0x91
0xE1
etc.
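For clarity, here's a minimal sketch of the kind of output routine I have in mind (plain C; print_hex_le is just a name I made up, and all it does is emit the nibbles lowest-first):

    #include <stdio.h>
    #include <stdint.h>

    /* Print a value with its hex digits least-significant first,
       i.e. the reversed display shown above.  Leading zeroes (which
       would end up trailing here) are dropped so small numbers stay
       short. */
    static void print_hex_le(uint32_t value)
    {
        static const char digits[] = "0123456789ABCDEF";

        fputs("0x", stdout);
        if (value == 0) {
            putchar('0');
            return;
        }
        while (value != 0) {
            putchar(digits[value & 0xF]);  /* lowest nibble first */
            value >>= 4;
        }
    }

    int main(void)
    {
        for (uint32_t i = 0; i <= 30; i += 5) {  /* counting by 5s */
            print_hex_le(i);
            putchar('\n');
        }
        return 0;
    }

which prints exactly the sequence above: 0x0, 0x5, 0xA, 0xF, 0x41, 0x91, 0xE1.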
I'm just assuming that this is a minor enough thing not to start a flamewar, but I've been wrong before.
