The parsers are different in JS, Python, Go and Java.
Numbers are not precise:
- `MAX_SAFE_INTEGER` limits integers. Twitter had to add an `id_str` field next to the numeric `id`.
- Decimal precision is unreliable (in JS) --> always use dedicated decimal types (Python's `Decimal`, Java's `BigDecimal`, a JavaScript decimal library).
- UTF-8 strings in JSON may use single Unicode code points or composed sequences. Use `.normalize("NFC")` on JS strings.
- Object key order is not guaranteed; sort keys alphabetically when you need a canonical JSON form.
- Different languages handle the absence of a value (`undefined`, `null`, or a missing property) differently.
- No time format is official, so it's always custom:
```json
{
  "iso_string": "2023-01-15T10:30:00.000Z",
  "unix_timestamp": 1673780200,
  "unix_milliseconds": 1673780200000,
  "date_only": "2023-01-15",
  "custom_format": "15/01/2023 10:30:00"
}
```
- Different parsers fail differently on malformed JSON.
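The first two pitfalls can be seen directly in JS (the big ID value below is made up for illustration):

```javascript
// Large IDs exceed Number.MAX_SAFE_INTEGER, so the numeric field
// silently loses precision while the string field survives intact.
const payload = '{"id": 10765432100123456789, "id_str": "10765432100123456789"}';
const parsed = JSON.parse(payload);
console.log(Number.isSafeInteger(parsed.id));          // false
console.log(parsed.id.toString() === parsed.id_str);   // false: digits were rounded

// Precomposed vs decomposed Unicode: visually identical, not equal.
const composed = "\u00e9";     // "é" as a single code point
const decomposed = "e\u0301";  // "e" + combining acute accent
console.log(composed === decomposed);                                    // false
console.log(composed.normalize("NFC") === decomposed.normalize("NFC"));  // true
```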
The Twitter example is only one case. There is also PostgreSQL, which stores the format as JSON (as-is) and JSONB (normalized).
MongoDB uses an extended JSON format.
Workarounds:
- Use Schema Validation!
- custom normalisation function
- Tests! Numeric Precision Tests, Unicode and String Handling, Date and Time Consistency, Error Handling Uniformity, Cryptographic Consistency, Performance and Memory Behavior
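A custom normalisation function could look like this sketch (the function name and rules are illustrative, not from any library): sort keys, NFC-normalize strings, and keep big integers as strings before transport.

```javascript
// Recursively canonicalize a value before JSON.stringify:
// sorted keys, NFC strings, bigints as strings.
function normalize(value) {
  if (typeof value === "string") return value.normalize("NFC");
  if (typeof value === "bigint") return value.toString();
  if (Array.isArray(value)) return value.map(normalize);
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.keys(value).sort().map((k) => [k, normalize(value[k])])
    );
  }
  return value; // numbers, booleans, null pass through
}

const canonical = JSON.stringify(
  normalize({ b: 1, a: "e\u0301", id: 10765432100123456789n })
);
console.log(canonical); // {"a":"é","b":1,"id":"10765432100123456789"}
```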
Ok interesting
Import a JSON file so it can be manipulated with the Unix filesystem tools, then re-export it to JSON.
Describe a JSON structure
The post covers the JSON format across several topics.
All that we did to get this speedup is implement the Serialize trait using one line for the body of the serialize method!
But implementing the trait directly loses the ability to derive it with the `#[derive(Serialize)]` macro.
Instead, you should implement it on wrapper types that act like formatters.
Also for efficiency: `format_args!` doesn't allocate or even apply the formatting! It only returns an `Arguments` value that borrows its arguments.
An extension of JSON that allows one valid JSON entity per line. It optimizes parsing because the entire JSON file does not have to be loaded first.
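A minimal sketch of why this helps: each line parses independently, so a reader can process records one at a time (the sample events are made up):

```javascript
// JSON Lines: one JSON value per line; no line depends on the rest
// of the file, so a stream reader never needs the whole file in memory.
const jsonl = '{"event":"login","user":1}\n{"event":"click","user":2}\n';
const events = jsonl
  .split("\n")
  .filter((line) => line.trim() !== "")
  .map((line) => JSON.parse(line));
console.log(events.length);   // 2
console.log(events[1].event); // "click"
```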
A simple-to-use local JSON database. Query it with the native JavaScript API. Written in TypeScript.
Seamlessly visualize your JSON data instantly into graphs.
A minimal and simple JSON schema. The syntax is actually easy 👍
Creating virtual columns in SQLite to greatly improve SQL queries.
However, I don't think that storing JSON as-is is a good idea...
But yes, it is almost a NoSQL database ツ
TL;DR: you have to read the entire binary blob and build the full data structure before filtering, whereas a JSON parser can skip parts, which makes it faster.
JSON, however, allows stream parsing. Consider for example the Jackson library's [nextToken()](https://fasterxml.github.io/jackson-core/javadoc/2.8/com/fasterxml/jackson/core/JsonParser.html#nextToken()) or Go's `json.Decoder.Token()`.
Sadly the code produced for Amazon Web Services is not open-source yet.
How to get started with AJV to validate JSON files against a given format.
What this getting-started guide is missing is how to write the schemas!
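For reference, a schema can look like this (a minimal, illustrative JSON Schema; the field names are made up):

```javascript
// A small JSON Schema object: requires two string fields,
// constrains id_str to digits, and rejects unknown keys.
const schema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  type: "object",
  required: ["id_str", "created_at"],
  properties: {
    id_str: { type: "string", pattern: "^[0-9]+$" },
    created_at: { type: "string" },
  },
  additionalProperties: false,
};

// With AJV (not bundled here), validation would look like:
//   const Ajv = require("ajv");
//   const validate = new Ajv().compile(schema);
//   validate({ id_str: "123", created_at: "2023-01-15T10:30:00.000Z" }); // true
console.log(schema.required.includes("id_str")); // true
```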