
verticapy.read_json#

verticapy.read_json(path: str, schema: str | None = None, table_name: str | None = None, usecols: list | None = None, new_name: dict | None = None, insert: bool = False, start_point: str | None = None, record_terminator: str | None = None, suppress_nonalphanumeric_key_chars: bool = False, reject_on_materialized_type_error: bool = False, reject_on_duplicate: bool = False, reject_on_empty_key: bool = False, flatten_maps: bool = True, flatten_arrays: bool = False, temporary_table: bool = False, temporary_local_table: bool = True, gen_tmp_table_name: bool = True, ingest_local: bool = True, genSQL: bool = False, materialize: bool = True, use_complex_dt: bool = False, is_avro: bool = False) vDataFrame#

Ingests a JSON file using flex tables.

Parameters#

path: str

Absolute path where the JSON file is located.

schema: str, optional

Schema where the JSON file will be ingested.

table_name: str, optional

Final relation name.

usecols: list, optional

List of the JSON parameters to ingest; all other parameters are ignored. If empty, all JSON parameters are ingested.

new_name: dict, optional

Dictionary of the new column names. If the JSON file is nested, it is recommended to change the final names because special characters will be included in the new column names. For example, {"param": {"age": 3, "name": "Badr"}, "date": "1993-03-11"} will create 3 columns: “param.age”, “param.name” and “date”. You can rename these columns using the new_name parameter with the following dictionary: {"param.age": "age", "param.name": "name"}
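The period-separated naming convention and the effect of a new_name-style mapping can be illustrated with a small pure-Python sketch. This is only an illustration of the naming scheme, not VerticaPy's internal implementation; the flatten_keys helper is hypothetical.

```python
import json

def flatten_keys(record: dict, prefix: str = "") -> dict:
    """Flatten nested maps, joining key levels with a period,
    mimicking the column names a flex table would expose."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten_keys(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

record = json.loads('{"param": {"age": 3, "name": "Badr"}, "date": "1993-03-11"}')
flat = flatten_keys(record)
# flat has the keys "param.age", "param.name", and "date"

# Applying a new_name-style mapping strips the "param." prefix:
new_name = {"param.age": "age", "param.name": "name"}
renamed = {new_name.get(k, k): v for k, v in flat.items()}
# renamed has the keys "age", "name", and "date"
```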

insert: bool, optional

If set to True, the data is ingested into the input relation. The JSON parameters must be the same as those of the input relation; otherwise, they will not be ingested. If set to True, table_name cannot be empty.

start_point: str, optional

Name of a key in the JSON load data at which to begin parsing. The parser ignores all data before the start_point value. The value is loaded for each object in the file. The parser processes data after the first instance of the key, up to the second, ignoring any remaining data.
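The start_point semantics can be sketched in plain Python: given a document where the useful records sit under one key, parsing begins at that key's value and everything before and after it is ignored. This is a hedged illustration of the behavior only; the wrapper key "data" and the sample document are made up.

```python
import json

document = """
{
  "metadata": {"source": "sensor", "version": 2},
  "data": [
    {"id": 1, "reading": 0.5},
    {"id": 2, "reading": 0.7}
  ],
  "trailer": "not loaded"
}
"""

# With start_point="data", the parser would load only the records
# under "data", skipping "metadata" before it and "trailer" after it.
records = json.loads(document)["data"]
# records contains the two objects under "data"
```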

record_terminator: str, optional

Specifies a character that terminates each record. When set, any invalid JSON records are skipped and parsing continues with the next record. Records must be terminated uniformly; for example, if your input file has JSON records terminated by newline characters, set this parameter to \n. Even if the data contains no invalid records, specifying an explicit record terminator can improve load performance by allowing cooperative parse and apportioned load to operate more efficiently. When you omit this parameter, parsing ends at the first invalid JSON record.
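The skip-on-error behavior for newline-terminated records can be approximated in pure Python. This is a sketch of the semantics only, not Vertica's parser; the sample data is made up.

```python
import json

# Three newline-terminated records; the second is invalid JSON.
raw = '{"a": 1}\n{"a": 2,,,}\n{"a": 3}\n'

valid, rejected = [], 0
for line in raw.splitlines():          # records terminated by "\n"
    if not line.strip():
        continue
    try:
        valid.append(json.loads(line))
    except json.JSONDecodeError:
        rejected += 1                  # skip and resume at the next record
# With a record terminator, parsing continues past the bad record;
# without one, it would stop at the first invalid record.
```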

suppress_nonalphanumeric_key_chars: bool, optional

Whether to suppress non-alphanumeric characters in JSON key values. When this parameter is True, the parser replaces these characters with an underscore (_).

reject_on_materialized_type_error: bool, optional

Whether to reject a data row that contains a materialized column value that cannot be coerced into a compatible data type. If False and the type cannot be coerced, the parser sets the value in that column to None. If the column is a strongly-typed complex type, as opposed to a flexible complex type, a type mismatch anywhere in the complex type causes the entire column to be treated as a mismatch; the parser does not partially load complex types.

reject_on_duplicate: bool, optional

Whether to ignore duplicate records (False) or to reject duplicates (True). In either case, the load continues.

reject_on_empty_key: bool, optional

Whether to reject any row containing a field key without a value.

flatten_maps: bool, optional

Whether to flatten sub-maps within the JSON data, separating map levels with a period (.). This setting affects all data in the load, including nested maps.

flatten_arrays: bool, optional

Whether to convert lists to sub-maps with integer keys. When lists are flattened, key names are concatenated in the same way as for maps. Lists are not flattened by default. This setting affects all data in the load, including nested lists.
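How array flattening composes with map flattening can be shown with a short sketch: list indices become integer keys joined to the parent name with a period, exactly as map keys are. This helper and the sample record are hypothetical illustrations, not VerticaPy internals.

```python
def flatten(value, prefix=""):
    """Flatten maps and lists alike, as if flatten_maps and
    flatten_arrays were both enabled: map keys and list indices
    are joined to the parent name with a period."""
    flat = {}
    if isinstance(value, dict):
        items = value.items()
    elif isinstance(value, list):
        items = ((str(i), v) for i, v in enumerate(value))
    else:
        return {prefix.rstrip("."): value}
    for key, child in items:
        flat.update(flatten(child, prefix=f"{prefix}{key}."))
    return flat

record = {"tags": ["red", "blue"], "meta": {"ids": [7, 8]}}
flat = flatten(record)
# flat maps "tags.0" -> "red", "tags.1" -> "blue",
# "meta.ids.0" -> 7, and "meta.ids.1" -> 8
```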

temporary_table: bool, optional

If set to True, a temporary table will be created.

temporary_local_table: bool, optional

If set to True, a temporary local table will be created. The parameter schema must be empty, otherwise this parameter is ignored.

gen_tmp_table_name: bool, optional

If set to True, a random name is generated for the temporary table. This parameter is only used when temporary_local_table is set to True and the parameters table_name and schema are unspecified.

ingest_local: bool, optional

If set to True, the file will be ingested from the local machine.

genSQL: bool, optional

If set to True, the SQL code for creating the final table is generated but not executed. This is a good way to change the final relation types or to customize the data ingestion.

materialize: bool, optional

If set to True, the flex table is materialized into a table. Otherwise, it remains a flex table. Flex tables simplify data ingestion but have worse performance than regular tables.

use_complex_dt: bool, optional

Whether the input data file has a complex structure. If set to True, most of the other parameters are ignored.

Returns#

vDataFrame

The vDataFrame of the relation.

Examples#

In this example, we will first create a JSON file using vDataFrame.to_json() and then ingest it into the Vertica database.

We import verticapy:

import verticapy as vp

Hint

By assigning an alias to verticapy, we mitigate the risk of code collisions with other libraries. This precaution is necessary because verticapy uses commonly known function names like “average” and “median”, which can potentially lead to naming conflicts. The use of an alias ensures that the functions from verticapy are used as intended without interfering with functions from other libraries.

We will use the Titanic dataset.

import verticapy.datasets as vpd

data = vpd.load_titanic()
[Output: the Titanic vDataFrame — columns: pclass (Integer), survived (Integer), name (Varchar(164)), sex (Varchar(20)), age (Numeric(8)), sibsp (Integer), parch (Integer), ticket (Varchar(36)), fare (Numeric(12)), cabin (Varchar(30)), embarked (Varchar(20)), boat (Varchar(100)), body (Integer), home.dest (Varchar(100)). Rows: 1-100 | Columns: 14]

Note

VerticaPy offers a wide range of sample datasets that are ideal for training and testing purposes. You can explore the full list of available datasets on the Datasets page, which provides detailed information on each dataset and how to use them effectively. These datasets are invaluable resources for honing your data analysis and machine learning skills within the VerticaPy environment.

Let’s convert the vDataFrame to a JSON file.

data[0:20].to_json(
    path = "titanic_subset.json",
)

Let’s ingest the JSON file into the Vertica database.

from verticapy.core.parsers.json import read_json

read_json(
    path = "titanic_subset.json",
    table_name = "titanic_subset",
    schema = "public",
)
[Output: the ingested vDataFrame — columns: boat (Varchar(20)), body (Integer), cabin (Varchar(22)), age (Numeric(10)), home.dest (Varchar(62)), embarked (Varchar(20)), fare (Numeric(13)), name (Varchar(94)), parch (Integer), pclass (Integer), sex (Varchar(20)), sibsp (Integer), survived (Integer), ticket (Varchar(20)). Rows: 1-20 | Columns: 14]

Let’s ingest the JSON file and rename some columns.

read_json(
    path = "titanic_subset.json",
    table_name = "titanic_sub_newnames",
    schema = "public",
    new_name = {
        "fields.fare": "fare",
        "fields.sex": "sex",
    },
)
[Output: the ingested vDataFrame — Rows: 1-20 | Columns: 14]

Let’s ingest only two columns from the JSON file.

read_json(
    path = "titanic_subset.json",
    table_name = "titanic_sub_usecols",
    schema = "public",
    usecols = [
        "fields.fare",
        "fields.sex",
    ],
)
[Output: the ingested vDataFrame — Rows: 1-20 | Columns: 14]

Note

You can ingest multiple JSON files into the Vertica database by using the following syntax.

read_json(
    path = "*.json",
    table_name = "titanic_multi_files",
    schema = "public",
)
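Before ingesting with a glob pattern, it can be useful to preview which files will match. A small sketch using Python's standard glob module; the temporary directory and file names here are made up for illustration.

```python
import glob
import json
import os
import tempfile

# Create two sample JSON files in a temporary directory.
tmpdir = tempfile.mkdtemp()
for i in range(2):
    with open(os.path.join(tmpdir, f"part_{i}.json"), "w") as f:
        json.dump({"id": i}, f)

# The same "*.json" pattern read_json would receive:
matched = sorted(glob.glob(os.path.join(tmpdir, "*.json")))
# matched lists part_0.json and part_1.json
```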

See also

read_file() : Ingests an input file into the Vertica DB.
read_avro() : Ingests an Avro file into the Vertica DB.
read_csv() : Ingests a CSV file into the Vertica DB.
read_pandas() : Ingests the pandas.DataFrame into the Vertica DB.