
I was reading the answer to this question - Swift native functions to have numbers as hex string - and that is exactly what I want to do. However, when I try this in a playground:

let str = String(num: 123.55, radix: 16);
print(str)

This is printed:

(123.55, 16)

Why is it formatting it that way?

The same thing happens, for what it's worth, if I use a whole number, e.g., 123.

absmiths

3 Answers


I read the conversation more closely and it was the 'num' label that broke it. If I use String(123, radix: 16) it works fine. Fractions do NOT work, however. I got a runtime error with String(123.55, radix: 16).

EDIT: As nhgrif points out below, this is now a compiler error, not a runtime error, so it should never have been allowed. This is true as of Xcode Version 7.1.1 (7B1005). I believe what is going on is that, with the num: label on the first parameter, the compiler creates a tuple and then calls a String initializer that converts the tuple to its string representation. I personally would have expected that call to look like this:

let str = String((num: 1, radix: 2));

The compiler treats this the same way (it produces the same output).
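
To illustrate, here is a minimal sketch of the tuple interpretation (the exact output varies by Swift version; newer compilers print the labels as well):

let tuple = (num: 123.55, radix: 16)  // the argument list parsed as a single labeled tuple
print(tuple)                          // "(123.55, 16)" here; newer Swift versions print "(num: 123.55, radix: 16)"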

absmiths
  • Why did you answer your own question if it's not really an answer? Please put this in a comment if you haven't solved the problem yet! – Victor Sigler Nov 11 '15 at 23:09
  • Are you pointing out that I did not mark the answer as an answer, or that I did not answer it properly? This is the actual answer to my question, though there are still many things about this I don't get. – absmiths Nov 17 '15 at 14:20
  • My point is that your answer is not complete, because the error you point out regarding the `num` parameter was pointed out in every answer. Nevertheless, regarding the conversion of `Double` numbers to another base, my answer points you in the direction of how to do it, because it's not supported yet. – Victor Sigler Nov 17 '15 at 14:23
  • No other answer points to the num label on the first argument of the initializer; that is the problem the root question points to, so this is the answer. The other answers point out that non-integer types cannot be formatted with a radix other than 10, which, while true, was not the original question; it was only a distraction. – absmiths Nov 17 '15 at 14:27

Your problem is that the String initializer in this case has no base-conversion function implemented to convert a Double or Float to another base. You can follow the approaches explained in the question How to convert float number to Binary? and this article, and implement it yourself.
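
For example, here is a minimal sketch of that approach in current Swift syntax (doubleToString is a hypothetical helper, not part of the standard library): it converts the integer part with the built-in initializer and produces fractional digits by repeated multiplication.

import Foundation

func doubleToString(_ value: Double, radix: Int, fractionDigits: Int = 8) -> String {
    precondition((2...36).contains(radix), "radix must be between 2 and 36")
    let digits = Array("0123456789abcdefghijklmnopqrstuvwxyz")
    var magnitude = abs(value)
    let integerPart = Int(magnitude)
    // The integer part converts directly with the standard initializer.
    var result = (value < 0 ? "-" : "") + String(integerPart, radix: radix)
    magnitude -= Double(integerPart)
    if magnitude > 0 {
        result += "."
        // Repeatedly multiply the remaining fraction by the radix;
        // each integer part that falls out is the next digit.
        for _ in 0..<fractionDigits {
            magnitude *= Double(radix)
            let digit = Int(magnitude)
            result.append(digits[digit])
            magnitude -= Double(digit)
            if magnitude == 0 { break }
        }
    }
    return result
}

print(doubleToString(123.55, radix: 16)) // roughly "7b.8ccccccc" (0.55 is not exactly representable in binary)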

The way to call the initializer for integers is as follows:

let num = 12355
let str = String(num, radix: 16)
print(str) // prints "3043"
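
For what it's worth, the same initializer also takes an uppercase argument (default false), as shown in the extension quoted in nhgrif's answer:

let upper = String(255, radix: 16, uppercase: true)
print(upper) // prints "FF"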

I hope this helps you.

Victor Sigler

That your code even compiles (or is interpreted by the Playground) is a compiler bug. It could also be a sign that you're not on the most up-to-date version of Xcode, or that you need to clean and rebuild.

The initializer you are trying to use is only defined for integer types.

extension String {
    /// Create an instance representing `v` in base 10.
    public init<T : _SignedIntegerType>(_ v: T)
    /// Create an instance representing `v` in base 10.
    public init<T : UnsignedIntegerType>(_ v: T)
    /// Create an instance representing `v` in the given `radix` (base).
    ///
    /// Numerals greater than 9 are represented as roman letters,
    /// starting with `a` if `uppercase` is `false` or `A` otherwise.
    public init<T : _SignedIntegerType>(_ v: T, radix: Int, uppercase: Bool = default)
    /// Create an instance representing `v` in the given `radix` (base).
    ///
    /// Numerals greater than 9 are represented as roman letters,
    /// starting with `a` if `uppercase` is `false` or `A` otherwise.
    public init<T : UnsignedIntegerType>(_ v: T, radix: Int, uppercase: Bool = default)
}

And on my Xcode, in a Playground, attempting to use this initializer with something that isn't a _SignedIntegerType or a UnsignedIntegerType produces an error.

Cannot invoke initializer for type 'String' with an argument list of type '(Double, radix: Int)'
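
If you need hex output for a Double anyway, one possible workaround is to convert to an integer first. This is a sketch of my own, not part of the answer above, and it assumes truncating the fractional part is acceptable:

let value = 123.55
// String(value, radix: 16)              // rejected: no integer-type overload matches
let hex = String(Int(value), radix: 16)  // "7b"; the fractional part is truncated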


nhgrif
  • You are correct in that I got an Xcode update yesterday which now says this is an error: `let str = String(123.55, radix: 16);` It does, however, still allow this: `let str = String(num: 123.55, radix: 16);` which seems weird. Maybe it thinks it is a tuple, because this produces the same output: `let str2 = (num: 123.55, radix: 16); print(str2);` – absmiths Nov 17 '15 at 14:16
  • This isn't meant badly, but I found it funny when I noticed you used the label 'hex' for a binary-formatted string (base-2). – absmiths Nov 17 '15 at 14:37