
I am using CryptoSwift to encrypt data. I am learning how to use it, but I cannot get past the first basic tutorial: I am unable to convert the encrypted data back into a String after decrypting it, which rather defeats the purpose of encrypting it in the first place if I cannot read the decrypted result.
Code:

import CryptoSwift

let string = "Hi. This is Atlas"

let input: [UInt8] = Array(string.utf8)

print(input)

let key: [UInt8] = [0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00]

let iv: [UInt8] = AES.randomIV(AES.blockSize)

do {
    let encryptedBytes: [UInt8] = try AES(key: key, iv: iv, blockMode: .CBC).encrypt(input, padding: PKCS7())

    print(encryptedBytes)

    let decrypted: [UInt8] = try AES(key: key, iv: iv, blockMode: .CBC).decrypt(encryptedBytes, padding: PKCS7())

    print(decrypted) // << need to convert this array of bytes back to a String (should equal the original input)
} catch {
    print(error)
}

Thank you for the help

– atlas81887

2 Answers


You'll want Foundation to decode the UTF8 for you since there's no way to generate a String.UTF8View directly. So convert to NSData first.

import Foundation

let decrypted: [UInt8] = [0x48, 0x65, 0x6c, 0x6c, 0x6f] // "Hello" in UTF-8
let data = NSData(bytes: decrypted, length: decrypted.count)
let str = String(data: data, encoding: NSUTF8StringEncoding) // String?; nil if the bytes are not valid UTF-8
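Because the initializer is failable, the result needs to be unwrapped before use. Here is a minimal usage sketch under the same Swift 2 era Foundation API, spelled through NSString to make the optional handling explicit; the byte values are only an assumed stand-in for the question's decrypted output.

import Foundation

// Assumed decrypted bytes; 0x48 0x69 is "Hi" in UTF-8.
let decryptedBytes: [UInt8] = [0x48, 0x69]
let nsData = NSData(bytes: decryptedBytes, length: decryptedBytes.count)

// NSString's failable initializer returns nil if the bytes are not valid UTF-8.
if let decoded = NSString(data: nsData, encoding: NSUTF8StringEncoding) {
    print(decoded as String) // "Hi"
} else {
    print("decrypted bytes were not valid UTF-8")
}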

If you want to do it without Foundation, you can, but it's a little work. You have to manage the decoding yourself.

extension String {
    /// Builds a String by UTF-8-decoding the given bytes; returns nil if the bytes are not valid UTF-8.
    init?(utf8Bytes: [UInt8]) {
        var decoder = UTF8()
        var generator = utf8Bytes.generate()
        var characters: [Character] = []

        LOOP: while true {
            switch decoder.decode(&generator) {
            case .Result(let scalar): characters.append(Character(scalar))
            case .EmptyInput: break LOOP
            case .Error: return nil
            }
        }
        self.init(characters)
    }
}

let unicode = String(utf8Bytes: decrypted) // Optional("Hello")

(I'm very surprised that this isn't built into Swift stdlib since it's so common and can be quickly built out of other parts of Swift stdlib. Often when that's the case, there's a reason that I'm just not aware of yet, so there may be some subtle problem with my approach here.)
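For reference: later Swift releases did add exactly this decoding to the standard library. A minimal sketch in Swift 5 syntax, which was not available when this answer was written:

// Swift 5+: decode UTF-8 code units directly; invalid sequences become U+FFFD.
let decryptedBytes: [UInt8] = [0x48, 0x65, 0x6c, 0x6c, 0x6f] // "Hello"
let text = String(decoding: decryptedBytes, as: UTF8.self)
print(text) // "Hello"

There is also the failable Foundation overlay initializer String(bytes:encoding:) if a nil result on invalid input is preferred.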

– Rob Napier
    I made a similar extension for UTF-8/16/32 here http://stackoverflow.com/a/24757284/1187415 :) – Martin R Dec 29 '15 at 20:06
  • @MartinR Have you seen any reason that this isn't built in? It seems the obvious approach, but a surprising pain to implement. – Rob Napier Dec 29 '15 at 20:09
  • I have no idea (particularly since the *opposite* conversion is available via the `utf8` and `utf16` properties). – Btw., I tried to apply this method to http://stackoverflow.com/q/34372391/1187415, but that did not work because once the codec has reported EmptyInput you cannot feed it more data. – Martin R Dec 29 '15 at 20:15
let stringDecrypted = String(decrypted.map { Character(UnicodeScalar($0)) })

So it maps each UInt8 to a UnicodeScalar and then to a Character; String's initializer then builds the String from the array of Characters. Note that because every byte becomes its own scalar, this round-trips correctly only for ASCII text; multi-byte UTF-8 sequences come out garbled, whereas the UTF-8 decoding in the other answer handles them.
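A small sketch of that limitation, using an assumed two-byte character (the variable names are illustrative only):

// "é" is encoded in UTF-8 as the two bytes 0xC3 0xA9.
let accentBytes: [UInt8] = [0xC3, 0xA9]

// Byte-by-byte mapping treats each byte as its own Unicode scalar (Latin-1 style),
// producing two characters instead of one:
let byteWise = String(accentBytes.map { Character(UnicodeScalar($0)) })
print(byteWise) // "Ã©", not "é"

// Proper UTF-8 decoding (e.g. the NSData/NSString route in the other answer)
// reassembles the two bytes into the single character "é".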

– Sebastian Osiński