Swift: Cast Any Object to Int64 = Nil

In

let dict : [String : AnyObject] = ["intValue": 1234, "stringValue" : "some text"]

the number 1234 is stored as an NSNumber object, and that can
be cast to Int, UInt, Float, ..., but not to the fixed
size integer types like Int64.

To retrieve a 64-bit value even on 32-bit platforms, you have to
go via NSNumber explicitly:

if let val = dict["intValue"] as? NSNumber {
    let int64value = val.longLongValue // This is an `Int64`
    print(int64value)
}
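For reference, here is a minimal self-contained version of the above. Note that Swift 3 renamed `longLongValue` to `int64Value`; both return an `Int64`:

```swift
import Foundation

let dict: [String: Any] = ["intValue": 1234, "stringValue": "some text"]

// The number is bridged to NSNumber; ask it for a 64-bit value explicitly.
if let val = dict["intValue"] as? NSNumber {
    let int64value: Int64 = val.int64Value
    print(int64value) // 1234
}
```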

How to convert Any to Int in Swift

var i = users[0]["Age"] as! Int

As GoZoner points out, if you don't know that the downcast will succeed, use:

var i = users[0]["Age"] as? Int

The result will be nil if the cast fails.
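A minimal sketch of the difference between the two casts; the `users` value here is a made-up example:

```swift
let users: [[String: Any]] = [["Age": 21, "Name": "Alice"]]

// Conditional cast: yields an optional that you can bind safely.
if let age = users[0]["Age"] as? Int {
    print(age) // 21
}

// A failed conditional cast produces nil instead of crashing.
let missing = users[0]["Name"] as? Int
print(missing as Any) // nil
```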

Why am I getting nil for the Int64 value?

You should look at the raw JSON. For example, grab the data and print it:

let string = response.data.flatMap { String(data: $0, encoding: .utf8) }
print(string ?? "Unable to get String representation of the data")

I'm wagering that the numbers will be in quotes, meaning that they're really string representations of numeric values, not actual numbers:

[{"noofcars": "2", "cllstname": "Smith", ...

Note, I'm asking you to look at the raw JSON, not just the result which has already been decoded. We need to see how the raw JSON represented these numeric values.

Let's assume for a moment that my supposition is correct and you're getting these values as strings. Clearly, the correct solution is for the web service to send numeric values as numbers, not as strings. But if you want to work with the existing JSON, you can first get the String representation for noofcars (etc.), and then convert that to a numeric value. For example:

AF.request("xxxxxxxxxxxxx", headers: headers).responseJSON { response in
    guard let dictionaries = response.value as? [[String: Any]] else { return }

    self.allClients = dictionaries.compactMap { dictionary -> ClientsData? in
        let cllstname = dictionary["cllstname"] as? String ?? "Error"
        let awards = (dictionary["noofawards"] as? String).flatMap { Int($0) } ?? 0
        let cars = (dictionary["noofcars"] as? String).flatMap { Int($0) } ?? 0
        let dogs = (dictionary["noofdogs"] as? String).flatMap { Int($0) } ?? 0
        return ClientsData(cllstname: cllstname, noofawards: awards, noofcars: cars, noofdogs: dogs)
    }
    .sorted { $0.noofawards > $1.noofawards }
}

FWIW, please forgive the refactoring above: I would recommend map or compactMap rather than building an empty array and appending values to it. It's just a bit more concise.
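The conversion idiom used above can be seen in isolation; the dictionary contents here are hypothetical:

```swift
let dictionary: [String: Any] = ["noofcars": "2", "noofdogs": 3]

// `as? String` yields a String?; flatMap applies the failable Int.init
// only when the cast succeeded, keeping the result a single Int?.
let cars = (dictionary["noofcars"] as? String).flatMap { Int($0) } ?? 0

// Here the value is already a number, so the String cast fails and
// the fallback 0 is used.
let dogs = (dictionary["noofdogs"] as? String).flatMap { Int($0) } ?? 0

print(cars, dogs) // 2 0
```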

Unable to unwrap a large Int

Numeric dictionary objects are bridged to NSNumber.

Cast the value to NSNumber and get the longLongValue

if let epochTime = postData["createdDate"] as? NSNumber {
    let longNumber: Int64 = epochTime.longLongValue
}

Cast String to Int64 causes crash on 32bit devices

It's not the problem of Int64.init(_:) but the problem of the format given to NSPredicate.

Length specifier l means its argument needs to be long or unsigned long, which are equivalent to Int or UInt in Swift.

String Format Specifiers

If you want to use Int64 value as a format argument, the right length specifier is ll, meaning long long which is equivalent to Int64 in Swift.

let predicateBarcode = NSPredicate(format: "barcode = %lld", Int64(searchTerm)!)

You may need to fix some other parts, but I cannot tell, as you are hiding them. (And as far as I tested, I could not make my test app crash.) In addition, are you 100% sure that Int64(searchTerm)! will not crash?

Anyway, the format string needs to be fixed at least.
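A safer sketch that avoids the force-unwrap, using a hypothetical search term; String(format:) is used here just to demonstrate the %lld specifier:

```swift
import Foundation

let searchTerm = "9876543210" // exceeds Int32.max but fits in Int64

// Int64.init?(_:) returns nil for non-numeric or out-of-range strings,
// so bind it instead of force-unwrapping.
if let barcode = Int64(searchTerm) {
    // %lld is the matching length specifier for Int64 on all platforms.
    print(String(format: "barcode = %lld", barcode))
}
```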

Swift code problem - trying to cast objects out from a NSMutableArray as a custom object

A better approach would be to use JSONDecoder() instead of JSONSerialization. Try the following code.

struct User: Codable {
    var userId, firstName, lastName, userSessionId: String

    enum CodingKeys: String, CodingKey {
        case userId = "Userid"
        case firstName = "First_Name"
        case lastName = "Last_Name"
        case userSessionId = "Session_ID"
    }
}

func parseJSON(_ data: Data) {
    do {
        let users = try JSONDecoder().decode([User].self, from: data)
        users.forEach { user in
            print("users first name:", user.firstName)
        }
    } catch {
        print(error.localizedDescription)
    }
}
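Putting it together with a hypothetical payload; the JSON body below is invented to match the coding keys:

```swift
import Foundation

struct User: Codable {
    var userId, firstName, lastName, userSessionId: String

    enum CodingKeys: String, CodingKey {
        case userId = "Userid"
        case firstName = "First_Name"
        case lastName = "Last_Name"
        case userSessionId = "Session_ID"
    }
}

// Invented sample data; a real response would come from the network.
let json = Data("""
[{"Userid": "7", "First_Name": "Jane", "Last_Name": "Doe", "Session_ID": "abc123"}]
""".utf8)

// try! for brevity in this sketch; prefer do/catch as shown above.
let users = try! JSONDecoder().decode([User].self, from: json)
print(users[0].firstName) // Jane
```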

Mismatching types 'Int64' and '_' when trying to assign optional Int64 to a dictionary key

There are a number of problems here. First, Int64 isn't an AnyObject. None of the primitive number types are classes. They can be bridged to AnyObject using NSNumber, but you don't get that bridging automatically for Int64 (see MartinR's comment. I originally said this was because it was wrapped in an Optional, but it's actually because it's fixed-width).

Next, this syntax:

(something != nil) ? something! : nil

is just a very complicated way to say `something`.

The tool you want is map, so that you can take your optional and convert it to an NSNumber if it exists:

dictionary["something"] = something.map(NSNumber.init)

Of course, if at all possible, get rid of the AnyObject. That type is a huge pain and causes a lot of problems. If this were a [String: Int64] you could just:

dictionary["something"] = something
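A minimal sketch of the map approach; the dictionary and value here are hypothetical:

```swift
import Foundation

var dictionary: [String: AnyObject] = [:]
let something: Int64? = 9_000_000_000

// map turns Int64? into NSNumber?; assigning nil through the subscript
// simply leaves the key absent, so no explicit nil-check is needed.
dictionary["something"] = something.map { NSNumber(value: $0) }

print(dictionary.keys.contains("something")) // true
```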

Swift Casting doesn't work as Expected

It is not Swift (language) specific; it is Apple specific.

Even worse, if you change the data to

let d = ["first":"john", "last":"doe", "test int": 0, "test null": NSNull()] as [String:Any]

the Linux version works as expected:

["test null": <NSNull: 0x0000000000825560>, "last": "doe", "first": "john", "test int": 0]
["test null": <NSNull: 0x0000000000825560>, "last": "doe", "first": "john", "test int": 0]
["test int": 0, "last": "doe", "first": "john", "test null": <NSNull: 0x0000000000825560>]

but Apple prints

[:]
[:]
{
    first = john;
    last = doe;
    "test int" = 0;
    "test null" = "<null>";
}

It looks very strange. The next code snippet explains why:

import Foundation

public protocol P {}

extension String: P {}
extension Int: P {}
extension NSNull: P {}

let d = ["first": "john", "last": "doe", "test null": NSNull(), "test int": 10] as [String: Any]
print("A)", d, type(of: d))

let d1 = d as? [String: P] ?? [:]
print("B)", d1, type(of: d1))
print()

if let data = try? JSONSerialization.data(withJSONObject: d, options: []) {

    if let jobject = try? JSONSerialization.jsonObject(with: data, options: []) {

        let o = jobject as? [String: Any] ?? [:]
        print("1)", o, type(of: o))

        var o2 = o as? [String: P] ?? [:]
        print("2)", o2, type(of: o2), "is it empty?: \(o2.isEmpty)")
        print()

        if o2.isEmpty {
            o.forEach { t in
                let v = t.value as? P
                print("-", t.value, type(of: t.value), "as? P", v as Any)
                o2[t.key] = t.value as? P ?? 0
            }
        }
        print()
        print("3)", o2)
    }
}

On Apple platforms it prints

A) ["test null": <null>, "test int": 10, "first": "john", "last": "doe"] Dictionary<String, Any>
B) ["test null": <null>, "test int": 10, "first": "john", "last": "doe"] Dictionary<String, P>

1) ["test null": <null>, "test int": 10, "first": john, "last": doe] Dictionary<String, Any>
2) [:] Dictionary<String, P> is it empty?: true

- <null> NSNull as? P Optional(<null>)
- 10 __NSCFNumber as? P nil
- john NSTaggedPointerString as? P nil
- doe NSTaggedPointerString as? P nil

3) ["test null": <null>, "test int": 0, "first": 0, "last": 0]

while on Linux it prints

A) ["test int": 10, "last": "doe", "first": "john", "test null": <NSNull: 0x00000000019d8c40>] Dictionary<String, Any>
B) ["test int": 10, "last": "doe", "first": "john", "test null": <NSNull: 0x00000000019d8c40>] Dictionary<String, P>

1) ["test int": 10, "last": "doe", "first": "john", "test null": <NSNull: 0x00000000019ec550>] Dictionary<String, Any>
2) ["test int": 10, "last": "doe", "first": "john", "test null": <NSNull: 0x00000000019ec550>] Dictionary<String, P> is it empty?: false

3) ["test int": 10, "last": "doe", "first": "john", "test null": <NSNull: 0x00000000019ec550>]

Finally, I took the source code of JSONSerialization from the open-source distribution, slightly modified (to avoid a conflict with Apple's Foundation I renamed the class to _JSONSerialization), and changed it so that it works in my Playground without any warnings or errors. It prints the expected results.

Why does it work now? The key is this documentation comment:

/* A class for converting JSON to Foundation/Swift objects and converting Foundation/Swift objects to JSON.
   An object that may be converted to JSON must have the following properties:
   - Top level object is a `Swift.Array` or `Swift.Dictionary`
   - All objects are `Swift.String`, `Foundation.NSNumber`, `Swift.Array`, `Swift.Dictionary`, or `Foundation.NSNull`
   - All dictionary keys are `Swift.String`s
   - `NSNumber`s are not NaN or infinity
*/

Now the conditional downcasting of all possible values to P works as expected.

To see another difference, try this snippet on Linux and on Apple platforms:

let d3 = [1.0, 1.0E+20]
if let data = try? JSONSerialization.data(withJSONObject: d3, options: []) {
    if let jobject = try? JSONSerialization.jsonObject(with: data, options: []) as? [Double] ?? [] {
        print(jobject)
    }
}

Apple prints

[1.0, 1e+20]

while Linux prints

[]

and with a really big value it will crash. The bug comes from this code in the open-sourced JSONSerialization:

if doubleResult == doubleResult.rounded() {
    return (Int(doubleResult), doubleDistance)
}

replace it with

if doubleResult == doubleResult.rounded() {
    if doubleResult < Double(Int.max) && doubleResult > Double(Int.min) {
        return (Int(doubleResult), doubleDistance)
    }
}

and deserialization works as expected (serialization has other bugs...).

Cannot convert value of type 'Int64?' to expected argument type 'Int'

This is related to https://stackoverflow.com/a/32793656/8236481

Swift 3 introduces failable initializers to safely convert one integer type to another. By using init?(exactly:) you can pass one type to initialize another, and it returns nil if the initialization fails. The value returned is an optional which must be unwrapped in the usual ways.

Int(exactly: yourInt64)

