hive - Decimal type columns in Spark
I have some questions regarding decimal type columns.

- I don't understand why I got a decimal type column rather than a double type column. I did a division operation in Spark SQL that looks like:

select a/b as c from table

Is it because small numbers are generated in the data that Spark decides the column type should be decimal?
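For context, one way to see what type Spark inferred is to print the schema of the query result. A minimal sketch, assuming a running SparkSession named `spark` and the hypothetical table/column names from the query above:

```scala
// Assumes an existing SparkSession `spark`; `table`, `a`, `b` are the
// hypothetical names from the question. When `a` and `b` are decimal
// columns (e.g. read from a Hive DECIMAL column), a/b stays decimal,
// with precision and scale derived from the operands.
val df = spark.sql("select a/b as c from table")
df.printSchema()
```

Dividing two integer columns, by contrast, promotes the result to double; the decimal result type only appears when the operands are themselves decimal.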
- I got an error when trying to save the table to Hive using the saveAsTable method:

java.lang.UnsupportedOperationException: Parquet does not support decimal. HiveException: java.lang.UnsupportedOperationException: Parquet does not support decimal. See HIVE-6384
What should I do? Convert the column type to double? Is there an automatic way to handle this, or can I tell Spark to use the double type instead of the decimal type at query time?
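One workaround sketch, under the same assumptions as above (a SparkSession `spark` and hypothetical table/column names): cast the result to double before writing, so the Parquet writer never sees a decimal column.

```scala
import org.apache.spark.sql.functions.col

// Option 1: cast inside the SQL itself.
val df = spark.sql("select cast(a/b as double) as c from table")

// Option 2: cast an existing DataFrame column with the DataFrame API.
val fixed = df.withColumn("c", col("c").cast("double"))

// The output table name here is hypothetical.
fixed.write.format("parquet").saveAsTable("my_table")
```

Either form should sidestep the HIVE-6384 limitation, at the cost of losing the exactness guarantees of the decimal type.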