
I am trying to convert TypeScript to Swift.

I am currently working on mapping a string input, in this case the initials of a user, to a fixed color. In our frontend application, the following TypeScript code is used:

export default function getColorFromString(value: string) {
  var hash = 0;
  if (value.length === 0) return hash;

  for (var i = 0; i < value.length; i++) {
    hash = value.charCodeAt(i) + ((hash << 5) - hash);
    hash = hash & hash;
  }
  hash = ((hash % this.colors.length) + this.colors.length) % this.colors.length;
  return this.colors[hash];
}

`this.colors` is defined as:

const colors = [
  '#e51c23',
  '#e91e63',
  '#9c27b0',
  '#673ab7',
  '#3f51b5',
  '#5677fc',
  '#03a9f4',
  '#00bcd4',
  '#009688',
  '#259b24',
  '#8bc34a',
  '#afb42b',
  '#ff9800',
  '#ff5722',
  '#795548',
  '#607d8b',
];

I am having some difficulty porting this code to Swift, and I am stuck on compilation errors.

This is my attempt so far. Keep in mind that it doesn't compile; it only shows what I have tried myself.

    func getColorFromString(value: String) -> String {
        var hash = 0

        for (index, value) in value.enumerated() {
            hash = UnicodeScalar(value[index]) + ((hash << 5) - hash);
            hash = hash & hash;
        }
    }

Could anyone help me convert this to Swift?

Update: this question was closed for similarity with How to use hex color values, but that does not answer my question: I am looking for how to convert a string to a hash, and that hash to a specific array index. The array contains the actual hex values. At some point those hex values need to be converted to a color, and only at that point will the related questions be useful.
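For completeness, that final hex-to-color step can be sketched without any of the linked answers. The `rgbComponents` helper below is my own illustration (not part of the question) and assumes plain `#rrggbb` strings:

```swift
// Hypothetical helper (not from the original code): parse "#rrggbb"
// into 0...255 RGB components, ready for UIColor(red:green:blue:alpha:)
// or SwiftUI's Color(red:green:blue:) after dividing by 255.
func rgbComponents(fromHex hex: String) -> (red: Int, green: Int, blue: Int)? {
    // Drop the leading "#" and parse the remainder as one hex number.
    guard let value = UInt64(hex.dropFirst(), radix: 16) else { return nil }
    return (Int((value >> 16) & 0xFF),
            Int((value >> 8) & 0xFF),
            Int(value & 0xFF))
}
```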

  • I don't use `swift` but a `)` without a `(` should also be a syntax error in `swift` – Andreas Oct 24 '22 at 08:45
  • The question is about getting a hash value from an arbitrary string (which is then used as an index into a color table). It has nothing to do with converting strings of the form "#RRGGBB" to a UIColor (or SwiftUI Color). It is *not* a duplicate of (and completely unrelated to) https://stackoverflow.com/q/24263007/1187415 or the other suggested duplicate targets. I have therefore reopened the question. – Martin R Oct 31 '22 at 17:35
  • Thank you! Marked my solution as answer. Please let me know if you have any feedback @MartinR – Wouter Dijks Oct 31 '22 at 18:31

1 Answer


My working solution:

let colorArray = [
    "#e51c23",
    "#e91e63",
    "#9c27b0",
    "#673ab7",
    "#3f51b5",
    "#5677fc",
    "#03a9f4",
    "#00bcd4",
    "#009688",
    "#259b24",
    "#8bc34a",
    "#afb42b",
    "#ff9800",
    "#ff5722",
    "#795548",
    "#607d8b"
]
// User initials
print(getColorFromString(name: "WD", colors: colorArray))

func getColorFromString(name: String, colors: [String]) -> String {
    var hash = 0
    let count = colors.count

    for unicode in name.unicodeScalars {
        let bitShift = (hash << 5) - hash
        hash = Int(unicode.value) + bitShift
    }
    hash = ((hash % count) + count) % count
    return colors[hash]
}

And for SwiftUI users working with Color(): my issue was that I wanted to get the same color as the frontend every time. The prerequisite is that the order of the color array is the same.

static let defaultColors = [
    Color("BrightRed"),
    Color("Pink"),
    Color("Purple"),
    Color("DarkPurple"),
    Color("defBlue2"),
    Color("defBlue1"),
    Color("defBlue3"),
    Color("Cyan"),
    Color("Aqua"),
    Color("defGreen"),
    Color("BrightGreen"),
    Color("Olive"),
    Color("defOrange"),
    Color("BrightOrange"),
    Color("Brown"),
    Color("BlueGray")
]

var avatarColor: Color {
    var hash = 0
    let colors = Asset.DeskieColor.defaultColors
    let count = colors.count

    for char in name.utf16 {
        let bitshift = (hash << 5) &- hash
        hash = Int(char) &+ bitshift
    }

    hash = ((hash % count) + count) % count
    return colors[hash]
}
}

extension Collection {
    public subscript(safe index: Index) -> Element? {
        indices.contains(index) ? self[index] : nil
    }
}
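The `Collection` extension above isn't actually called in the snippets; presumably it is meant to guard the table lookup. A minimal usage sketch (repeating the extension so it runs standalone):

```swift
extension Collection {
    public subscript(safe index: Index) -> Element? {
        // Return nil instead of trapping when the index is out of bounds.
        indices.contains(index) ? self[index] : nil
    }
}

let colors = ["#e51c23", "#e91e63", "#9c27b0"]
let inRange = colors[safe: 2]       // Optional("#9c27b0")
let outOfRange = colors[safe: 10]   // nil, where colors[10] would crash
```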
  • `var avatarColor: Color { ... }` uses the UTF-16 representation and the `&-` overflow operator, so that looks like the correct Swift version of your Java/TypeScript code to me. The `getColorFromString()` may crash (try it with longer strings!) and uses the Unicode scalars, so it may give different results for strings containing special characters like Emojis. (That is probably not an issue if the string is just the initials of a user. But you may want to test it with Chinese or other non-European characters.) – Martin R Nov 01 '22 at 08:51
  • Thanks for your comment. I discovered it overflows after about 15 characters. It could also be handled with the overflow operators you explained in https://stackoverflow.com/questions/74259506/equivalent-of-swift-in-javascript/74262280#74262280, I believe. Thankfully the string issue won't involve emojis. Originally it would be names only, so foreign characters might pose a potential issue. Hence my SwiftUI color solution :). – Wouter Dijks Nov 01 '22 at 09:00
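Putting the comments together: `charCodeAt` reads UTF-16 code units, and JavaScript's `hash = hash & hash` truncates the hash to a signed 32-bit integer on every iteration. Under that reading, a sketch of an exact port (my own, not from the answer) uses `Int32` with overflow operators so long strings neither crash nor diverge from the frontend:

```swift
// Sketch of an exact port of the TypeScript hash. Assumption: `hash & hash`
// in JS coerces to signed 32-bit, so Int32 with wrapping arithmetic mirrors it.
func colorIndex(for name: String, colorCount: Int) -> Int {
    var hash: Int32 = 0
    for unit in name.utf16 {                         // charCodeAt(i) == UTF-16 unit
        hash = Int32(unit) &+ ((hash << 5) &- hash)  // wraps instead of trapping
    }
    let count = Int32(colorCount)
    return Int(((hash % count) + count) % count)     // always a non-negative index
}

// "WD": 87 + 0 = 87, then 68 + (87*32 - 87) = 2765; 2765 mod 16 = 13
let index = colorIndex(for: "WD", colorCount: 16)    // 13 -> "#ff5722"
```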