We will use the dataframe named df_basket1.

Get data type of single column in pyspark using printSchema() – Method 1

df_basket1.select('columnname').printSchema() is used to get the data type of a single column.

df_basket1.select('Price').printSchema()

We use the select() function to select a column and the printSchema() function to get the data type of that particular column. So in our case we get the data type of the 'Price' column.

Get data type of single column in pyspark using dtypes – Method 2

df_basket1.select('columnname').dtypes is the syntax used to get the data type of a single column.

df_basket1.select('Price').dtypes

We use the select() function to select a column and dtypes to get the data type of that particular column.

Get data type of multiple columns in pyspark using printSchema() – Method 1

df_basket1.select('columnname1','columnname2').printSchema() is used to get the data types of multiple columns.

df_basket1.select('Price','Item_name').printSchema()

We use the select() function to select multiple columns and the printSchema() function to get their data types. So in our case we get the data types of the 'Price' and 'Item_name' columns.

Get data type of multiple columns in pyspark using dtypes – Method 2

df_basket1.select('columnname1','columnname2').dtypes is used to get the data types of multiple columns.

df_basket1.select('Price','Item_name').dtypes

So in our case we get the data types of the 'Price' and 'Item_name' columns as a list of (column, type) tuples.