hive - Decimal type columns in Spark


I have two questions regarding decimal type columns.

  1. I don't understand why I got decimal type columns where I expected double type columns.

I did a division operation in Spark SQL that looks like:

select a/b as c from table

Is it because small numbers are generated in the data that Spark determines the column type to be decimal?
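For what it's worth, the result type here is driven by the operand types, not by the values in the data: when the operands are decimal (for example, columns read from a Hive table declared as DECIMAL), Spark SQL derives a decimal result type for the quotient. If a double result is wanted, one way (a sketch, reusing the column and table names from the query above) is to cast explicitly:

```sql
-- Casting the operands (or equivalently the result) to double makes
-- the quotient column come out as double instead of decimal.
select cast(a as double) / cast(b as double) as c from table
```

The same effect can be had by casting only the result: `cast(a/b as double) as c`.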

  2. I got an error when trying to save the table to Hive using the saveAsTable method:
java.lang.UnsupportedOperationException: Parquet does not support decimal.
HiveException: java.lang.UnsupportedOperationException:
Parquet does not support decimal. See HIVE-6384

What should I do? Should I convert the column type to double? Is there an automatic way to handle this, or a way to tell Spark to use double type instead of decimal type at that time?
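Converting the decimal column to double before writing is one common workaround for the HIVE-6384 limitation. A minimal sketch, assuming a Scala Spark application where `df` is the DataFrame produced by the division query and `c` is the decimal column (the table name `my_table` is a placeholder):

```scala
import org.apache.spark.sql.types.DoubleType

// Cast the decimal column to double in place; the resulting schema
// no longer contains a decimal type, so the Parquet writer used by
// saveAsTable does not hit the "Parquet does not support decimal" error.
val fixed = df.withColumn("c", df("c").cast(DoubleType))

// "my_table" is a hypothetical target table name for illustration.
fixed.write.format("parquet").saveAsTable("my_table")
```

There is no single switch that makes Spark use double everywhere instead of decimal, so the cast has to be applied per column, either as above or directly in the SQL with `cast(a/b as double)`.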

