26

I have a string that looks like this "090A0B0C" and I would like to convert it to a slice that looks something like this [9, 10, 11, 12]. How would I best go about doing that?

I don't want to convert a single hex char tuple to a single integer value. I want to convert a string consisting of multiple hex char tuples to a slice of multiple integer values.

Philippe
    1. We do expect some effort to solve the problem on your own. 2. I don't think you would want to obtain a _slice_, since that one will not own the content. – E_net4 Oct 25 '18 at 10:45
  • Possible duplicate of [Converting a hexadecimal string to a decimal integer](https://stackoverflow.com/questions/32381414/converting-a-hexadecimal-string-to-a-decimal-integer) – Stargateur Nov 02 '18 at 15:22
  • @Stargateur, the part that overlaps with my question was edited in after I had asked my question. – Philippe Nov 02 '18 at 15:32

2 Answers

43

You can also implement hex encoding and decoding yourself, in case you want to avoid the dependency on the hex crate:

use std::{fmt::Write, num::ParseIntError};

pub fn decode_hex(s: &str) -> Result<Vec<u8>, ParseIntError> {
    (0..s.len())
        .step_by(2)
        .map(|i| u8::from_str_radix(&s[i..i + 2], 16))
        .collect()
}

pub fn encode_hex(bytes: &[u8]) -> String {
    let mut s = String::with_capacity(bytes.len() * 2);
    for &b in bytes {
        write!(&mut s, "{:02x}", b).unwrap();
    }
    s
}

Note that the decode_hex() function panics if the string length is odd. I've made a version with better error handling and an optimised encoder available on the playground.
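As a sketch of one way to avoid that panic, you can validate the length up front before slicing; the name decode_hex_checked and the string error type here are illustrative choices, not the playground version:

```rust
use std::num::ParseIntError;

// Hypothetical stricter variant: reject odd-length input up front
// instead of panicking on an out-of-bounds slice in `&s[i..i + 2]`.
pub fn decode_hex_checked(s: &str) -> Result<Vec<u8>, String> {
    if s.len() % 2 != 0 {
        return Err(String::from("hex string has an odd length"));
    }
    (0..s.len())
        .step_by(2)
        .map(|i| {
            u8::from_str_radix(&s[i..i + 2], 16)
                .map_err(|e: ParseIntError| e.to_string())
        })
        .collect()
}

fn main() {
    assert_eq!(decode_hex_checked("090A0B0C").unwrap(), vec![9, 10, 11, 12]);
    assert!(decode_hex_checked("090").is_err()); // odd length
    assert!(decode_hex_checked("0Z").is_err()); // invalid digit
}
```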

Sven Marnach
  • your implementation would be cooler if you didn't rely on the `fmt` crate. I am compiling with the `no_std` flag and I can't use any `std`-based crates – Nulik May 17 '19 at 21:21
  • @Nulik But shouldn't you still be able to use the `core` library even with `no_std`? – Sven Marnach May 18 '19 at 19:02
  • @Sven yep! Found out about it this morning. – Nulik May 18 '19 at 19:55
  • Your solution is subtle but more versatile than the `hex` one. For instance, I was trying to convert the bytes generated by an `ed25519_dalek` signature into a string, which wasn't possible as it was a 64-byte-long array. By receiving a reference, you shrewdly solve the issue. – Miere Jul 01 '20 at 02:17
  • Btw, wouldn't it be interesting if you put it into a crate, relieving us of the burden of copying and pasting your improved version whenever needed? – Miere Jul 01 '20 at 02:18
  • 2
    @Miere The main point of this answer was to provide two simple functions that can be used if you don't want to use the `hex` crate for some reason, e.g. to reduce compile times. I don't quite know why I wrote the version on the playground. Does my implementation have any advantage over the `hex` crate? If so, I'm happy to put it in a new crate. – Sven Marnach Jul 01 '20 at 12:47
  • I'll benchmark it. Once I get the results I'll come back to you, @SvenMarnach – Miere Jul 02 '20 at 13:13
  • `encode_hex` has an unhandled `Result` from the `write!` macro. – Herohtar Apr 26 '21 at 06:30
  • 1
    @Herohtar Writing to a string always returns `Ok(())`. I added `.unwrap()` to make that explicit. – Sven Marnach Apr 26 '21 at 07:10
  • The step_by method would have saved me a lot of hassle if I had known to use it! Thanks! – Brian Kung Jan 12 '22 at 14:41
34

You could use the hex crate for that. The decode function looks like it does what you want:

fn main() {
    let input = "090A0B0C";

    let decoded = hex::decode(input).expect("Decoding failed");

    println!("{:?}", decoded);
}

The above will print [9, 10, 11, 12]. Note that decode returns a heap-allocated Vec<u8>; if you want to decode into an array, you'd want to use the decode_to_slice function:

fn main() {
    let input = "090A0B0C";

    let mut decoded = [0; 4];
    hex::decode_to_slice(input, &mut decoded).expect("Decoding failed");

    println!("{:?}", decoded);
}

or the FromHex trait:

use hex::FromHex;

fn main() {
    let input = "090A0B0C";

    let decoded = <[u8; 4]>::from_hex(input).expect("Decoding failed");

    println!("{:?}", decoded);
}
rnstlr
  • Thank you for this answer, just fixed a bug implementing this – Corfucinas May 03 '22 at 09:17
  • 1
    If I could I would give you another up vote. I was trying to do this without realizing that I already had the `hex` crate. I saw your answer a while ago and today it helped me out again! – Nico Serrano Oct 31 '22 at 22:21