When you change the variable's type, you tell the compiler how the 1s and 0s in the relevant area of memory should be interpreted whenever you make use of the variable in ordinary code.
When you supply a value to NSLog() via its format string, the behavior is a little different, because NSLog() is a variadic function and the format string is its only source of type information. By using a format specifier of %i, you're asking for the 1s and 0s in memory to be interpreted as a signed integer. Use a specifier of %u and you're asking for the same 1s and 0s to be interpreted as an _un_signed integer. It doesn't matter what type of variable you originally declared: NSLog never sees that type; it only sees the format specifier token and the contents in memory that you've asked to be shown.
So, if you declare an unsigned int, assign it a negative value, and then view that memory as if it were a standard int using %i, you'll see the negative number.
If you change the %i to %u when displaying a negative number, you won't get the value you expect; you'll see the value that results when the 1s and 0s you stored for a signed number are interpreted as if they belonged to an unsigned one. For added amusement, try declaring a float variable (floating point, i.e. a number with a decimal part like 3.14) and viewing it as if it were an integer with the %i token.
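Here's a minimal sketch of all of the above in one place (the variable names are just for illustration, and I've used a pointer cast for the float trick so the bits aren't disturbed by varargs promotion):

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        unsigned int value = -1;   // stores the bit pattern 0xFFFFFFFF

        NSLog(@"%%i sees: %i", value);   // same bits interpreted as signed:   -1
        NSLog(@"%%u sees: %u", value);   // same bits interpreted as unsigned: 4294967295

        // The float amusement: reinterpret a float's bits as an integer.
        float f = 3.14f;
        NSLog(@"float bits via %%i: %i", *(int *)&f);   // prints the raw bit pattern, not 3
    }
    return 0;
}
```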
This would probably all be less confusing if the compiler warned you when you assign a negative number to an unsigned variable, but on standard settings, it doesn’t. You can enable a warning by setting a compiler flag of -Wconversion. This will trigger a warning of “Negative integer implicitly converted to unsigned type” for your example, but the flag will likely cause a whole bunch of other warnings to appear for code in the rest of the project (turning this on for the project I currently have open in Xcode generates 417 warnings…!).
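If you want to see the warning in a small, isolated example rather than across a whole project, something like this should do it. The file name and compile command are just illustrative; if memory serves, you can also add -Wconversion to Xcode's "Other Warning Flags" build setting, which is what applies it everywhere and produces the warning flood.

```objc
// Compile with the flag enabled, e.g.:
//     clang -Wconversion -framework Foundation warn-demo.m -o warn-demo
#import <Foundation/Foundation.h>

int main(void) {
    unsigned int count = -1;   // -Wconversion flags this implicit
                               // signed-to-unsigned conversion
    NSLog(@"%u", count);       // prints 4294967295
    return 0;
}
```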