I've been attempting to manually upload the JSON data into BigQuery, but I keep encountering the following error message.
Error occurred while reading data, error message: Parsing error in JSON at row starting from position 0: Repeated field must be imported as a JSON array. Field: custom_fields.value.
I have already converted the file into newline-delimited JSON format, so that is not the issue here. Upon inspecting the custom_fields.value field mentioned in the error, I found the following:
$ cat convert2working.json | jq '.custom_fields[].value'
0
"Basics of information security\n"
"2021"
The problem appears to be that custom_fields.value holds values of varying types: a number in one record, strings in others. BigQuery expects a single type per column.
How can I standardize these types, or is there a better approach? Preferably, I would like to stick with JavaScript.
Below is a condensed excerpt of my JSON code:
{
  "id": "example",
  "custom_fields": [
    {
      "id": "example",
      "name": "Interval",
      "type": "drop_down",
      "type_config": {
        "default": 0,
        "placeholder": null,
        "options": [
          {
            "id": "example",
            "name": "yearly",
            "color": null,
            "orderindex": 0
          }
        ]
      },
      "date_created": "1611228211303",
      "hide_from_guests": false,
      "value": 0,
      "required": false
    },
    {
      "id": "example",
      "name": "Description",
      "type": "text",
      "type_config": {},
      "date_created": "1611228263444",
      "hide_from_guests": false,
      "value": "Basics of information security\n",
      "required": false
    },
    {
      "id": "example",
      "name": "Year",
      "type": "number",
      "type_config": {},
      "date_created": "1611228275285",
      "hide_from_guests": false,
      "value": "2021",
      "required": false
    }
  ]
}