How does Swift's int literal to float inference work?




Swift can infer integer literals to be Doubles or Floats:


let x: Float = 3
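The same works with Double, and with no context the literal falls back to Int (extra lines of my own for illustration):


let d: Double = 3   // the same integer literal inferred as Double
let i = 3           // no contextual type, so it defaults to Int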



It even works with arithmetic. It will convert everything before doing the math, so this is also 3:


let y: Float = 5/2 + 0.5



But what are the actual rules for this? There are ambiguous situations, for example if the inference is for a parameter:


func foobar(_ x: Int) -> Int {
    return x
}

func foobar(_ x: Float) -> Float {
    return x
}


foobar(1/2)



In this case it infers it as an int and returns 0, but if you delete the first function it switches to a float and returns 0.5.



What are the rules? Where is it documented?



Even more annoying is when Swift could infer a Float but doesn't:


func foobar(_ x: Float) -> Float {
    return x
}


let x = 1/2
foobar(x) // Cannot convert value of type 'Int' to expected argument type 'Float'






Related: Strange Swift numbers type casting: 1 and 2 can be both integer literals and floating point literals.

– Martin R
Sep 18 '18 at 18:17









Unfortunately I don't believe the ambiguity case is officially documented – but the rule is that the compiler prefers to use the default type for a literal where it can. The default type being Int for an integer literal and Double for a floating-point literal (glossing over IntegerLiteralType). When ranking solutions, the compiler assigns a score to each to work out which it should favour (higher score is worse), and "non-default literal" is a component of that score.

– Hamish
Sep 18 '18 at 18:44
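A small sketch of that preference, using a hypothetical pick function of my own (not from the thread):


func pick(_ x: Int) -> String { return "Int" }
func pick(_ x: Float) -> String { return "Float" }
func pick(_ x: Double) -> String { return "Double" }

pick(1)    // "Int": the default type for an integer literal is preferred
pick(1.0)  // "Double": the default type for a floating-point literal is preferred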








Re: your last example – Swift doesn't infer types across multiple statements (the constraint solver operates on a single expression with a potential contextual type to convert the expression type to).

– Hamish
Sep 18 '18 at 18:46
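So the last example compiles once the literal expression gets its contextual type within a single statement; a quick sketch reusing the question's foobar:


func foobar(_ x: Float) -> Float {
    return x
}

let x: Float = 1/2   // the annotation gives the literals a Float context here
foobar(x)            // 0.5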





3 Answers



Literals don't have a type as such. The docs say,



If there isn’t suitable type information available, Swift infers that the
literal’s type is one of the default literal
types defined in the Swift standard library. The default types are Int
for integer literals, Double for floating-point literals, String for
string literals, and Bool for Boolean literals.



So unless your argument explicitly says anything other than Int, it will infer integer literals as Int.
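A quick illustration of those defaults (my own example):


let i = 42       // Int, the default for integer literals
let d = 3.14     // Double, the default for floating-point literals
let s = "swift"  // String, the default for string literals
let b = true     // Bool, the default for Boolean literals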





Refer to Lexical Structure - Literals for more information.






Ah, the overloaded function makes the type ambiguous, so it picks the default type, which then finds the matching declaration.

– Max
Sep 18 '18 at 18:43



There are two Swift behaviors at play here:

1. A literal takes on whatever type the surrounding context asks for.
2. If the context is ambiguous, the literal falls back to its default type, which is Int for integer literals.



With only one function, rule 1 applies. It sees that a float is needed, so it infers the division as float division and the int literals as floats:


func foobar(_ x: Float) -> Float {
    return x
}

foobar(1/2) // 0.5



If you overload the function, rule 1 no longer works. The type is now ambiguous so it falls back to the default type of Int, which luckily matches one of the definitions:




func foobar(_ x: Int) -> Int {
    return x
}

func foobar(_ x: Float) -> Float {
    return x
}

foobar(1/2) // 0



See what happens if you make it so the default no longer works. Neither rule applies so you get an error:


func foobar(_ x: Double) -> Double {
    return x
}

func foobar(_ x: Float) -> Float {
    return x
}

foobar(1/2) // Ambiguous use of operator '/'
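If you run into that ambiguity, making the argument's type explicit resolves it; a sketch:


foobar(1/2 as Float)     // 0.5, the Float overload is chosen
foobar(Double(1) / 2)    // 0.5, the Double overload is chosen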






For all practical purposes the default type of an integer literal is Int, but just for fun it's worth noting that it's actually defined by the (undocumented) top-level type IntegerLiteralType, which can be changed by the user. Try defining typealias IntegerLiteralType = Double and seeing what happens with your last example :)

– Hamish
Sep 18 '18 at 19:16
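A sketch of that experiment (shadowing the typealias at the top level; the exact effect may vary with your Swift version):


typealias IntegerLiteralType = Double   // shadows the standard library default

let x = 1/2   // integer literals now default to Double, so x is 0.5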






By default, when you pass 1/2 as the argument, you are performing a calculation on two Ints, which evaluates to an Int result, so the first function is used.





To get a Float, at least one of the operands has to be a floating-point literal, so either 1.0/2, 1/2.0, or 1.0/2.0. This causes the second function to run instead.
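For instance, with both overloads from the question in scope (a sketch):


foobar(1/2)     // 0,   Int overload: both operands default to Int
foobar(1.0/2)   // 0.5, Float overload: 1.0 cannot be an Int
foobar(1/2.0)   // 0.5, Float overload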





In let x = 1/2, x is inferred to be of type Int because both 1 and 2 are of type Int.





Swift will not infer a Float unless one is indicated.


