
[Original] Uncle's Experience Sharing (84): Setting hive.exec.max.dynamic.partitions in Spark SQL Has No Effect


Spark 2.4

 

Running the following in Spark SQL:

set hive.exec.max.dynamic.partitions=10000;

and then executing the SQL still fails with:

org.apache.hadoop.hive.ql.metadata.HiveException:
Number of dynamic partitions created is 1001, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 1001.

The default value of hive.exec.max.dynamic.partitions is 1000, and the change made with SET does not take effect.
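
For context, a dynamic-partition insert is one where the partition values come from the query result rather than from literals, so a single statement can create many partitions at once. Below is a minimal Scala sketch of the pattern that hits this limit once the data contains more than 1000 distinct partition values; the table and column names (target_table, source_table, dt, col1, col2) are hypothetical and not from the original post.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// allow dynamic partitioning without a static leading partition value
spark.sql("set hive.exec.dynamic.partition.mode=nonstrict")

// the partition column dt is resolved from the data, not given literally;
// this fails with the HiveException above once the number of distinct dt
// values exceeds hive.exec.max.dynamic.partitions (default 1000)
spark.sql("""
  INSERT OVERWRITE TABLE target_table PARTITION (dt)
  SELECT col1, col2, dt FROM source_table
""")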

 

The reason, as explained in SPARK-19881 (referenced below), is as follows:

`HiveClient` does not know the new value 1001. There is no way to change the default value of `hive.exec.max.dynamic.partitions` of `HiveClient` with the `SET` command.

The root cause is that `hive` parameters are passed to `HiveClient` when it is created. So, the workaround is to use `--hiveconf` when starting `spark-shell`.
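
Since the value has to be in place at the moment the HiveClient is created, a programmatic equivalent of --hiveconf is to set it on the SparkSession builder before the session (and with it the HiveClient) is created. The sketch below follows from the root cause described above and assumes that hive.* settings present in the Spark configuration at session creation are forwarded to the HiveClient; the application name is hypothetical.

import org.apache.spark.sql.SparkSession

// set the limit before the Hive-enabled session exists, instead of via SET afterwards
val spark = SparkSession.builder()
  .appName("dynamic-partition-job")  // hypothetical name
  .config("hive.exec.max.dynamic.partitions", "10000")
  .enableHiveSupport()
  .getOrCreate()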

 

The workaround is to pass the setting via --hiveconf when launching spark-sql:

spark-sql --hiveconf hive.exec.max.dynamic.partitions=10000

 

Reference:

https://issues.apache.org/jira/browse/SPARK-19881


Original post: https://www.cnblogs.com/barneywill/p/11618898.html
