[C++] char to u_char


Question


Yeah, sorry, I wasn't clear enough.

What I want to do, more specifically, is take the value in the char variable and put it in a u_char variable.

When I use a type cast, the hex value isn't what it was before, i.e. it changes. I don't want the value to change.

Any ideas?

  Sir Rugmuncher said:
Yes I realised that as I was writing my previous reply, so my question now is how can I extract the same data and place it in a u_char variable?

You cannot. They are different types for a reason. An unsigned variable can hold a value between 0 and 2^n - 1, where n is the number of bits in the type; if n = 8, that is 0 to 255. A signed variable, on the other hand, holds (on a two's complement implementation, which is pretty much every platform) a value between -2^(n-1) and 2^(n-1) - 1; if n = 8, that is -128 to 127. This means you can only safely convert between the two if the value falls within the range common to both (0 to 127 for n = 8).

It's also worth pointing out that the signedness of the plain "char" type is implementation-defined, so it can behave as either signed or unsigned. It is also always exactly one byte, but a byte can be more than 8 bits (CHAR_BIT gives the count).
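
A minimal sketch that prints those ranges via std::numeric_limits (the exact bounds shown in the comments assume the common 8-bit, two's complement char):

#include <iostream>
#include <limits>

int main() {
    // Promote to int so the bounds print as numbers, not characters.
    std::cout << "signed char:   "
              << int(std::numeric_limits<signed char>::min()) << " to "
              << int(std::numeric_limits<signed char>::max()) << '\n';   // -128 to 127
    std::cout << "unsigned char: "
              << int(std::numeric_limits<unsigned char>::min()) << " to "
              << int(std::numeric_limits<unsigned char>::max()) << '\n'; // 0 to 255
    // Plain char is a distinct type whose range matches one of the above.
    std::cout << "char:          "
              << int(std::numeric_limits<char>::min()) << " to "
              << int(std::numeric_limits<char>::max()) << '\n';
}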

Edited by hdood

Of course you can convert between them. But you can't just cast the raw bits and magically have the data converted into some other form. A reinterpret-style cast basically says to the compiler, "I know the data here isn't really this type, but take the raw value and treat it as one; I've made sure that's fine." It doesn't do any conversion at all. A value cast like static_cast, by contrast, does convert the numeric value.

If you want to convert between them, you're going to have to define what you mean by "conversion." If all the values in the char data are non-negative, then the conversion is easy. For instance, if you're working with 7-bit ASCII, the high-order bit doesn't really matter. But it sounds like perhaps that isn't the case.
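
A minimal sketch of that distinction; the value -82 and the wrap to 174 assume a platform where plain char is signed, 8-bit, two's complement:

#include <iostream>

int main() {
    char positive = 0x41;                     // 'A' = 65, fits both ranges
    char negative = static_cast<char>(0xAE);  // -82 where char is signed

    // static_cast does a value conversion: 65 stays 65,
    // while -82 wraps modulo 256 back to 174 (the same byte, 0xAE).
    unsigned char up = static_cast<unsigned char>(positive);
    unsigned char un = static_cast<unsigned char>(negative);

    std::cout << int(positive) << " -> " << int(up) << '\n'; // 65 -> 65
    std::cout << int(negative) << " -> " << int(un) << '\n'; // -82 -> 174
}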

So... why are you using u_char anyway?


I've got to use u_char for a predefined function, and I got the char data from using sprintf to convert a value into hex.

I've just tried something like this:

char test[3];
u_char test2;
sprintf(test, "%X", 174);
cout << test << endl;
test2 = reinterpret_cast<unsigned char>(test);
cout << test2 << endl;

test becomes AE

test2 becomes ` (decimal 96)

Shouldn't that not happen, since the value was positive before the cast?

Edited by Sir Rugmuncher
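
For what it's worth, the garbage output is explained by the cast itself: test is a char array that decays to a pointer, so reinterpret_cast<unsigned char>(test) tries to stuff the pointer's address into a single byte and never touches the string "AE" at all (many compilers reject this outright; older ones truncate the address, which is where a value like 96 can come from). A minimal sketch of the two conversions the post might actually want, assuming u_char is the usual typedef for unsigned char:

#include <cstdio>
#include <cstdlib>
#include <iostream>

typedef unsigned char u_char; // assumption: the common BSD-style typedef

int main() {
    char test[3];
    std::sprintf(test, "%X", 174);  // test now holds "AE"
    // (1) Take one character of the hex string: 'A' (65) fits in u_char as-is.
    u_char first = static_cast<u_char>(test[0]);
    std::cout << first << '\n';      // prints A
    // (2) Recover the numeric value the string encodes: "AE" parses to 174.
    u_char value = static_cast<u_char>(std::strtoul(test, 0, 16));
    std::cout << int(value) << '\n'; // prints 174
}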