hbase – Cannot read large data from a Phoenix table

Hi all, I am getting an error when running a Phoenix count query on a large table.

0: jdbc:phoenix:hadoopm1:2181> select Count(*) from PJM_DATASET;
+------------+
|  COUNT(1)  |
+------------+

java.lang.RuntimeException: org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=36, exceptions:
Fri Jan 09 02:18:10 CST 2015, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=62365: row '' on table 'PJM_DATASET' at region=PJM_DATASET,,1420633295836.4394a3aa2721f87f3e6216d20ebeec44., hostname=hadoopctrl,60020,1420790733247, seqNum=27753

    at sqlline.SqlLine$IncrementalRows.hasNext(SqlLine.java:2440)
    at sqlline.SqlLine$TableOutputFormat.print(SqlLine.java:2074)
    at sqlline.SqlLine.print(SqlLine.java:1735)
    at sqlline.SqlLine$Commands.execute(SqlLine.java:3683)
    at sqlline.SqlLine$Commands.sql(SqlLine.java:3584)
    at sqlline.SqlLine.dispatch(SqlLine.java:821)
    at sqlline.SqlLine.begin(SqlLine.java:699)
    at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
    at sqlline.SqlLine.main(SqlLine.java:424)
0: jdbc:phoenix:hadoopm1:2181>

Please help.

You need to increase the following HBase configuration property to a higher value in both the HBase server and client configurations; its default is 1 minute (60000 ms):

  <property>
    <name>hbase.rpc.timeout</name>
    <value>600000</value>
  </property>
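Alongside the HBase RPC timeout, Phoenix has its own client-side query timeout that can also abort long-running scans. As a sketch (the 600000 ms value mirrors the setting above; tune it to your workload), the client-side `hbase-site.xml` might contain both properties:

```xml
<!-- Sketch of a client-side hbase-site.xml fragment; values are examples, not recommendations. -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>600000</value>
</property>
<property>
  <!-- Phoenix's own query timeout; it can cut off a scan even if the RPC timeout is raised. -->
  <name>phoenix.query.timeoutMs</name>
  <value>600000</value>
</property>
```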

Most importantly, the following command should return the correct HBase config directory, i.e. the one whose hbase-site.xml contains the property above. If the value of hbase_conf_path is empty or `.`, run `export HADOOP_CONF_DIR=<correct_hbase_dir>` to point at the correct HBase configuration before running `sqlline.py`.

phoenix_utils.py  | grep hbase_conf_path
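Putting the steps above together, a typical session might look like the following. This is an environment-setup sketch: the path `/etc/hbase/conf` and the ZooKeeper quorum `hadoopm1:2181` are assumptions taken from this question's setup, so substitute your own values.

```shell
# Point clients at the directory containing the hbase-site.xml edited above
# (assumption: HBase config lives under /etc/hbase/conf on this machine).
export HADOOP_CONF_DIR=/etc/hbase/conf

# Verify Phoenix now resolves the right config directory;
# hbase_conf_path should not be empty or "." here.
phoenix_utils.py | grep hbase_conf_path

# Re-launch the Phoenix shell with the corrected environment
# (hadoopm1:2181 is the quorum used in the question).
sqlline.py hadoopm1:2181
```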
Translated from: https://stackoverflow.com/questions/27856730/can-not-read-large-data-from-phoenix-table
