1

Given a &[char] argument, which I'm assuming is one or more hexadecimal digits, I am trying to call from_str_radix to convert it to bytes. However, from_str_radix expects &str, not &char. Is there a way to convert each &char into a &str?

fn convertCharsToBytes(args: &[char]) -> std::vec::Vec<u8> {
    let mut bytes:Vec<u8> = Vec::new();
    for arg in args {
        let byte = u8::from_str_radix(arg, 16); //arg here is invalid as it is a &char, not a &str
        match byte {
            Ok(value) => bytes.push(value),
            Err(error) => {}
        }
    }
    println!("{:?}", bytes);
    return bytes;
}
Glen Pierce
  • [The duplicate applied to your situation](https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=32f8e026aa17d800f718734a9a7dace2) – Shepmaster Oct 20 '20 at 13:25
  • 2
    https://www.reddit.com/r/rust/comments/jb3ukm/we_need_to_talk_about_stackoverflow/ – Robert McPythons Oct 20 '20 at 13:48
  • How is https://stackoverflow.com/questions/47629596/converting-a-char-to-str a duplicate? &char is not char. The proposed solutions involve using encode_utf8 which none of the answers below even considered. https://stackoverflow.com/questions/43983414/how-to-convert-a-rust-char-to-an-integer-so-that-1-becomes-1 this only applies to integers, I'm dealing with values a-f as well. – Glen Pierce Oct 22 '20 at 05:50
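For context, here is a minimal sketch of the encode_utf8 approach mentioned in that comment; the variable names and inputs are just an illustration. It converts a char to a &str using a small stack buffer, with no heap allocation:

fn main() {
    let c = 'a';
    let mut buf = [0u8; 4];                 // a char is at most 4 bytes in UTF-8
    let s: &str = c.encode_utf8(&mut buf);  // &str view into the buffer, no allocation
    assert_eq!(u8::from_str_radix(s, 16), Ok(10));
}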

3 Answers

1

I think this will work as expected:

fn convertCharsToBytes(args: &[char]) -> std::vec::Vec<u8> {
    let mut bytes:Vec<u8> = Vec::new();
    for arg in args {
        // to_string turns the char into a one-char String, which coerces to &str
        let byte = u8::from_str_radix(&arg.to_string(), 16);
        match byte {
            Ok(value) => bytes.push(value),
            Err(_) => (),
        }
    }
    println!("{:?}", bytes);
    return bytes;
}

&arg.to_string() borrows the one-char String generated for each char, giving the &str that from_str_radix expects.
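A quick usage sketch of the function above (the main function and the input values are just an illustration, not part of the question):

fn main() {
    // each char is parsed on its own: '1' -> 1, 'a' -> 10, 'F' -> 15
    let bytes = convertCharsToBytes(&['1', 'a', 'F']);
    assert_eq!(bytes, vec![1, 10, 15]);
}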

asmmo
0

Slightly faster:

fn convertCharsToBytes(args: &[char]) -> std::vec::Vec<u8> {
    // E.g. 17 is encoded as 0x11 but 10 as 0x0A
    // So bytes are always encoded using 2 symbols
    assert_eq!(args.len() % 2, 0, "Hex input length must be even!");
    let mut bytes:Vec<u8> = Vec::with_capacity(args.len()/2);
    // Reuse one String allocation instead of calling to_string on every iteration
    let mut string = String::new();
    for num in args.chunks_exact(2) {
        // To eliminate bounds checks in indexing
        // Maybe in the future chunks_exact will return &[T; 2] instead of &[T],
        // but that is not possible before const generics
        assert_eq!(num.len(), 2);
        string.clear();
        string.push(num[0]);
        string.push(num[1]);
        let byte = u8::from_str_radix(&string, 16);
        match byte {
            Ok(value) => bytes.push(value),
            Err(_) => panic!("Invalid hex {}!", &string),
        }
    }
    println!("{:?}", bytes);
    return bytes;
}
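A usage sketch for this version (the inputs are chosen purely for illustration), showing that each pair of chars now becomes one byte:

fn main() {
    // "1a" -> 0x1a and "2B" -> 0x2b; from_str_radix accepts both cases
    let bytes = convertCharsToBytes(&['1', 'a', '2', 'B']);
    assert_eq!(bytes, vec![0x1a, 0x2b]);
}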

Or, if you do not mind the O(n) extra memory usage and pulling in a dependency:

fn convertCharsToBytes(args: &[char]) -> std::vec::Vec<u8> {
    let s: String = args.iter().collect();
    let bytes = hex::decode(&s).expect("Failed to parse hex");
    println!("{:?}", bytes);
    return bytes;
}

UPD: Fixed an error in the first code snippet (I had forgotten that one byte is encoded by two chars). This kind of error is the reason why you should use a dependency instead of writing your own solution, since a published crate is much better tested.
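A usage sketch for the hex-based version; this assumes the hex crate (e.g. hex = "0.4") has been added to Cargo.toml, and the input is illustrative:

fn main() {
    let bytes = convertCharsToBytes(&['d', 'e', 'a', 'd', 'b', 'e', 'e', 'f']);
    assert_eq!(bytes, vec![0xde, 0xad, 0xbe, 0xef]);
}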

0

You can avoid extra allocations by doing the conversion yourself:

let byte = if char >= 'a' { char as u32 - 'a' as u32 + 10 }      // 'a'..='f' -> 10..=15
           else if char >= 'A' { char as u32 - 'A' as u32 + 10 } // 'A'..='F' -> 10..=15
           else { char as u32 - '0' as u32 } as u8;              // '0'..='9' -> 0..=9

Plus error checking if you can't guarantee that char is a valid hexadecimal digit.

Note however that this (as well as your original code) will convert each char individually. So if you call it with "1a", then bytes will contain [0x01, 0x0a] and not [0x1a].
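As a sketch of what that error checking might look like, one option (not from the answer itself, and the snake_case function name is hypothetical) is the standard library's char::to_digit, which returns None for anything that is not a valid digit in the given radix:

fn convert_chars_to_bytes(args: &[char]) -> Result<Vec<u8>, char> {
    args.iter()
        .map(|&c| {
            // to_digit(16) accepts 0-9, a-f and A-F; anything else yields None
            c.to_digit(16)
                .map(|d| d as u8)
                .ok_or(c) // report the offending char as the error
        })
        .collect()
}

fn main() {
    assert_eq!(convert_chars_to_bytes(&['1', 'a', 'F']), Ok(vec![1, 10, 15]));
    assert_eq!(convert_chars_to_bytes(&['1', 'z']), Err('z'));
}

Like the snippet above, this still converts each char individually rather than pairing chars into bytes.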

Jmb