Handle Large Integers via native BigInt in the TypeScript client #101
Conversation
Great solution, hopefully your PR gets merged and we can lose the fork dependency.
Do you want me to adjust the client functions?
I don't mind if you do, though we need to adjust the JSON schema first if we want to keep some numbers in there. It'd also be fine to stick to bigint everywhere, I suppose?
I think we should implement them selectively: https://betterprogramming.pub/the-downsides-of-bigints-in-javascript-6350fd807d
@rhyslbw there's currently an annoying thing with the `json-bigint` parser: since there's no runtime reflection, it only produces a `bigint` when a number doesn't fit in a plain `number`. Which means that, even though we may define a particular field or type as `bigint` in TypeScript, the parsed value may actually be a `number`. See the following example:

```typescript
import JSONBig from 'json-bigint'

const safeJSON = JSONBig({ useNativeBigInt: true })

interface MyRecord {
  a: number;
  b: bigint;
}

const example1 = '{ "a": 14, "b": 42 }'
const example2 = '{ "a": 14, "b": 43543763985612362153 }'

const record1 = safeJSON.parse(example1) as MyRecord
const record2 = safeJSON.parse(example2) as MyRecord

console.log(typeof record1.b)
// number

console.log(typeof record2.b)
// bigint

/* This one is commented-out because it is a (legit) compilation error. */
// console.log(record2.b + record1.a)
//
// TSError: ⨯ Unable to compile TypeScript:
// tmp.ts:22:13 - error TS2365: Operator '+' cannot be applied to types 'bigint' and 'number'.

/* This one compiles, but gives a runtime error because the _real_ underlying
   types of `b` in both records are different! */
console.log(record1.b + record2.b)
// console.log(record1.b + record2.b)
//             ^
// TypeError: Cannot mix BigInt and other types, use explicit conversions
```

Turning `parse :: ByteString -> Maybe a` where
The only "okay-ish" option I see is to not declare types as `bigint`, but as `number | bigint`, since that is what they really are.
Seems like there's close to no support for runtime type reflection in TypeScript. I've found https://www.npmjs.com/package/reflec-ts, which gets us onto a slippery slope I am not fond of. So it seems like the `number | bigint` union is the way to go. For example, this works:

```typescript
import JSONBig from 'json-bigint'

const safeJSON = JSONBig({ useNativeBigInt: true })

interface MyRecord {
  a: number;
  b: number | bigint;
}

const example1 = '{ "a": 14, "b": 42 }'
const example2 = '{ "a": 14, "b": 43543763985612362153 }'

const record1 = safeJSON.parse(example1) as MyRecord
const record2 = safeJSON.parse(example2) as MyRecord

function sum (a: number | bigint, b: number | bigint): number | bigint {
  if (typeof a === 'number' && typeof b === 'number') {
    return a + b
  }
  if (typeof a === 'number' && typeof b === 'bigint') {
    return BigInt(a) + b
  }
  if (typeof a === 'bigint' && typeof b === 'number') {
    return a + BigInt(b)
  }
  return (a as bigint) + (b as bigint)
}

console.log(typeof record1.b)
// number

console.log(typeof record2.b)
// bigint

console.log(sum(record2.b, record1.a))
// 43543763985612362167n

console.log(sum(record1.b, record2.b))
// 43543763985612362195n
```
JSON 'integer' with no maximum boundary will now be generated as 'bigint', whereas the others will remain as 'number'. Note that this is _relatively_ safe: many integer types are actually uint64, ranging from 0 to 2^64-1, whereas JavaScript can only exactly represent integers up to 2^53-1. In practice, however, all those bounded numbers will also be smaller than 2^53-1, though they may well be bigger than a uint32 and therefore require a uint64 internally.
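For intuition, here is a small self-contained check (plain TypeScript, no library needed) of why an unbounded integer must become `bigint` while a bounded one stays safe as a plain `number`:

```typescript
// JavaScript numbers are IEEE-754 doubles: integers are exact only up to
// Number.MAX_SAFE_INTEGER, i.e. 2^53 - 1 = 9007199254740991.

// A typical bounded value (here 2^32 - 1, the uint32 maximum) fits exactly:
const bounded = 4294967295
console.log(Number.isSafeInteger(bounded))
// true

// The uint64 maximum (2^64 - 1) does not fit: once coerced to a number,
// adjacent integers collapse onto the same double.
const u64Max = BigInt('18446744073709551615')
console.log(Number(u64Max) + 1 === Number(u64Max))
// true: precision is silently lost above 2^53 - 1
```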
Some idea (not my most elegant code...): write a little wrapper around the JSON parser to sanitize a parsed result, based on some whitelisting. In the end, there aren't many interfaces which require bigints (namely, assets and metadatum), so it could work like this: https://gist.github.com/KtorZ/010b8ea3e703cc669f762635c912c959 The obvious drawback is a performance hit, since every parsed result needs to be traversed an additional time, and it completely escapes the compiler's watch: if we add new interfaces which require bigint, we need to remember to add them there. On the plus side, it's transparent and neat for the library users.
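To make the idea concrete, here is a minimal, self-contained sketch of such a whitelisting pass (the field names and helper below are hypothetical illustrations, not the gist's actual code):

```typescript
// Hypothetical whitelist of property names whose values should be bigint
// after parsing; the real list would be derived from the schema.
const BIGINT_FIELDS = new Set(['assets', 'metadatum'])

// Walk a parsed value and coerce whitelisted numeric fields to bigint,
// leaving everything else untouched.
function sanitize (value: any, key?: string): any {
  if (typeof value === 'number' && key !== undefined && BIGINT_FIELDS.has(key)) {
    return BigInt(value)
  }
  if (Array.isArray(value)) {
    return value.map(v => sanitize(v, key))
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [k, sanitize(v, k)])
    )
  }
  return value
}

const fixed = sanitize(JSON.parse('{ "assets": 42, "fee": 14 }'))
console.log(typeof fixed.assets) // bigint
console.log(typeof fixed.fee)    // number
```

Note that such a pass only helps for values that survived the first parse losslessly; truly huge numbers are already mangled by plain `JSON.parse`, which is why this has to run on top of `json-bigint`'s output rather than replace it.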
Agree with the assessment here, and your solution is very reasonable given the constraints. Without the coercion handled internally, it's a complex mess for implementations to handle.
So, I ended up implementing this for now: 84cf365 (not fully tested yet though...), which I find only mildly satisfactory. Yet, it should get us going for the time being, and we have to remember to be careful about possible new bigint types. In the meantime, I had a look at the json-bigint library, which indeed seems a bit abandoned. Besides our little issue here, I found at least two other issues which are worth fixing:
So, all-in-all, I think it's worth forking the library anyway, if only for these two changes. Then, I've started working on an implementation which would give us an API like:

```typescript
const _1 = JSONBig.parse(
  '{ "foo_1": 14, "foo_2": 42 }',
  schema.properties["Foo"]
)
console.log(_1);
// { foo_1: 14n, foo_2: 42 }

const _2 = JSONBig.parse(
  '[{ "bar": 42, "foo": 42 }, { "bar": 14, "foo": 14 }]',
  schema.properties["Bar"]
)
console.log(_2);
// [ { bar: 42, foo: 42n }, { bar: 14, foo: 14n } ]
```

which I find pretty neat. Since we know the expected schema when parsing, this would be quite easy to implement. However, my current hack does not handle the compound combinators well yet (anyOf, oneOf & allOf), which are quite challenging to handle correctly (since they can potentially lead to a choice between multiple candidate schemas). So, I'd be happy to continue working on that as a separate stream of work, while we still get going with the current fix and the post-processing. What do you think @rhyslbw ?
I'm happy with this result and agree we should decouple the optimisations.
The BigInt library has no way to know whether a type should be a normal number or a bigint when parsing, because there's no reflection available to JS code at runtime. Any type information from TypeScript is gone, and therefore the library chooses bigint only when necessary. In practice this is annoying, as it causes TypeScript to behave as if some type was a BigInt, whereas it is in reality parsed as _number_, causing runtime errors upon the first manipulation. We have explored three possible solutions, and this commit implements the second one (while the third one is being worked on, as it is more involved).

1. Return types as `number | bigint`, since this is what they are in the end. This pushes the concern down to the consumer, which will have to deal with it in a quite ugly way.

2. Perform a second pass on the parsed data, to turn into bigint those numbers which _should be_ bigint (according to their type definition) but have been deserialized as numbers. Not fully ideal, because it requires re-processing the entire JSON object, and it makes strong assumptions about the shape of the JSON object to identify which fields should be turned into bigint and which ones should be preserved. In the long run this isn't very maintainable, as it'll break as soon as we add a new field with a bigint type and forget to add it to the post-process checks.

3. Fork the bigint library, such that a JSON-schema can be given to the parser, which it can traverse while parsing a string and consult to figure out _which_ type a number should be deserialized into. This is really non-trivial, since JSON-schema supports compound operators such as `anyOf`, `oneOf` and `allOf`, requiring the parser to maintain a tree of possibilities which it attempts to narrow down as it traverses the input.
📍 Use fork / temporary version of json-schema-to-typescript including the bigint fix.
See bcherny/json-schema-to-typescript#401 (Add optional support for TypeScript's bigint).
📍 Use json-bigint parser and serialiser with native bigint.
TODO: