Isn't x/X unsigned?
According to all sources I can find, x and X interpret their argument as an unsigned int, not as a signed int. How come this page says it's signed? Jasbad 05:57, 28 August 2013 (PDT)
- Yes, you're right, that's an error. Thanks for spotting it. --P12 06:15, 28 August 2013 (PDT)
 The 'o' type field should also specify an unsigned integer.
Quote from C99 (7.19.6.1/8): "o,u,x,X The unsigned int argument is converted to unsigned octal (o), unsigned decimal (u), or unsigned hexadecimal notation (x or X) in the style dddd; the letters abcdef are used for x conversion and the letters ABCDEF for X conversion. The precision specifies the minimum number of digits to appear; if the value being converted can be represented in fewer digits, it is expanded with leading zeros. The default precision is 1. The result of converting a zero value with a precision of zero is no characters."