• 0

[C++] char to u_char


Question

Hello,

I have an array of char and I want to convert it to u_char, but I've tried several techniques and none of them work. I've tried reinterpret_cast<unsigned char>(variable) and (unsigned char)variable.

Can anyone help?


11 answers to this question

Recommended Posts

  • 0

What do you mean by none of them work? What is happening? I've not done much type casting in C++ but the few threads I found online seemed to use reinterpret_cast.


  • 0

Yeah, sorry, I wasn't clear enough.

What I want to do, more specifically, is get the value in the char variable and put it into a u_char variable.

When I use a type cast, the hex value isn't what it was before, i.e. it changes. I don't want the value to change.

Any ideas?


  • 0

For example, if I cout the char variable it will show AE,

but if I print the u_char value (type cast) it will show L (I don't remember the exact value).


  • 0

That is normal behaviour, though. The cast just reinterprets the bits inside the char as an unsigned value and then prints it like a character; on a two's complement machine a negative char such as -82 comes out as the large unsigned value 174, even though the bit pattern never changes.


  • 0

Yes I realised that as I was writing my previous reply, so my question now is how can I extract the same data and place it in a u_char variable?


  • 0
Yes I realised that as I was writing my previous reply, so my question now is how can I extract the same data and place it in a u_char variable?

You cannot. They are different types for a reason. An unsigned variable can hold a value between 0 and 2^n-1, where n is the number of bits in the type; if n=8, that is 0 to 255. A signed variable, on the other hand (on a two's complement implementation, which is pretty much every platform), can hold a value between -2^(n-1) and 2^(n-1)-1; with n=8, that is -128 to 127. This means you can only safely convert between the two if the value falls within the range they share (0 to 127 for n=8).

It's also worth pointing out that the signedness of plain "char" is implementation-specific: it can be either signed or unsigned. Its width is also implementation-defined (at least 8 bits), though it is always exactly one byte.

Edited by hdood

  • 0

Of course you can convert between them. But you can't cast between them and magically have the data converted into some other form. A cast basically says to the compiler "I know the data here isn't really a char, but you can take the raw value and treat it as one, I've made sure they're the same." A cast doesn't do any conversion or anything.

If you want to convert between them, you're going to have to define what you mean by "conversion." If all the values in the char variable are positive, then the conversion is easy. For instance, if you're working with 7-bit ASCII then the high-order bit doesn't really matter. But it sounds like perhaps that isn't the case.

So... why are you using u_char anyway?


  • 0

I've got to use u_char for a predefined function, and I got the char from using sprintf to convert a value into hex.

I've just tried something like this:

char test[3];
u_char test2;

sprintf(test, "%X", 174);
cout << test << endl;

test2 = reinterpret_cast<unsigned char>(test);
cout << test2 << endl;

test becomes AE; test2 becomes ` (decimal 96).

Shouldn't that not happen, since the value was positive before casting?

Edited by Sir Rugmuncher

  • 0

sprintf doesn't convert to "hex", it creates strings. You are trying to cast a pointer to THREE chars (containing the string "AE") into a single unsigned char. That is never going to give you the value you want.

sscanf, on the other hand, can be used to convert strings back to other types: sscanf(test, "%hhx", &test2); (the hh length modifier tells sscanf that the destination is a single unsigned char).

