[HBase Exception] java.io.IOException: Could not locate executable null\bin\winutils.exe when using HBase on Windows


Day-to-day development is usually done on Windows, and when working with HBase from a Windows machine you may run into the exception java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. I have hit this problem before, and today a colleague ran into it again, so here is a quick note. The exception looks like this:

    2016-05-23 17:02:13,551 WARN  [org.apache.hadoop.util.NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2016-05-23 17:02:13,611 ERROR [org.apache.hadoop.util.Shell] - Failed to locate the winutils binary in the hadoop binary path
    java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
        at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
        at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
        at org.apache.hadoop.conf.Configuration.getStrings(Configuration.java:1514)
        at org.apache.hadoop.hbase.zookeeper.ZKConfig.makeZKProps(ZKConfig.java:113)
        at org.apache.hadoop.hbase.zookeeper.ZKConfig.getZKQuorumServersString(ZKConfig.java:265)
        at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:159)
        at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:134)
        at org.apache.hadoop.hbase.client.ZooKeeperKeepAliveConnection.<init>(ZooKeeperKeepAliveConnection.java:43)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveZooKeeperWatcher(HConnectionManager.java:1710)
        at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:82)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:806)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:282)
        at net.shgaoxin.db.hbase.HbaseConnectionFactory.createResource(HbaseConnectionFactory.java:67)
        at net.shgaoxin.db.hbase.HbaseConnectionFactory.makeObject(HbaseConnectionFactory.java:40)
        at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:868)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
        at net.shgaoxin.base.AbstractPooledContainer.get(AbstractPooledContainer.java:49)
        at net.shgaoxin.db.hbase.HbaseConnectionContainer.getConnection(HbaseConnectionContainer.java:46)
        at net.shgaoxin.db.hbase.HbaseConnectionContainer.getConnection(HbaseConnectionContainer.java:14)
        at net.shgaoxin.db.hbase.HbaseTemplate.scan(HbaseTemplate.java:398)
        at net.shgaoxin.impl.dao.hbase.GenericDaoHbaseImpl.scan(GenericDaoHbaseImpl.java:73)
        at net.shgaoxin.impl.service.eastdayminisitesp.ImgUploadServiceImpl.getCurrentStepRowkeys(ImgUploadServiceImpl.java:260)
        at net.shgaoxin.impl.context.eastdayminisitesp.AsyncImgUploadContextImpl.doOnStart(AsyncImgUploadContextImpl.java:81)
        at net.shgaoxin.impl.context.AbstractProcessQueue.start(AbstractProcessQueue.java:119)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)

Looking at the relevant Hadoop source code (org.apache.hadoop.util.Shell):

      /** fully qualify the path to a binary that should be in a known hadoop
       *  bin location. This is primarily useful for disambiguating call-outs
       *  to executable sub-components of Hadoop to avoid clashes with other
       *  executables that may be in the path.  Caveat:  this call doesn't
       *  just format the path to the bin directory.  It also checks for file
       *  existence of the composed path. The output of this call should be
       *  cached by callers.
       **/
      public static final String getQualifiedBinPath(String executable)
          throws IOException {
        // construct hadoop bin path to the specified executable
        String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
            + File.separator + executable;

        File exeFile = new File(fullExeName);
        if (!exeFile.exists()) {
          throw new IOException("Could not locate executable " + fullExeName
              + " in the Hadoop binaries.");
        }

        return exeFile.getCanonicalPath();
      }

      private static String HADOOP_HOME_DIR = checkHadoopHome();

      /** Centralized logic to discover and validate the sanity of the Hadoop
       *  home directory. Returns either NULL or a directory that exists and
       *  was specified via either -Dhadoop.home.dir or the HADOOP_HOME ENV
       *  variable.  This does a lot of work so it should only be called
       *  privately for initialization once per process.
       **/
      private static String checkHadoopHome() {
        // first check the Dflag hadoop.home.dir with JVM scope
        String home = System.getProperty("hadoop.home.dir");

        // fall back to the system/user-global env variable
        if (home == null) {
          home = System.getenv("HADOOP_HOME");
        }

        try {
          // couldn't find either setting for hadoop's home directory
          if (home == null) {
            throw new IOException("HADOOP_HOME or hadoop.home.dir are not set.");
          }

          if (home.startsWith("\"") && home.endsWith("\"")) {
            home = home.substring(1, home.length() - 1);
          }

          // check that the home setting is actually a directory that exists
          File homedir = new File(home);
          if (!homedir.isAbsolute() || !homedir.exists() || !homedir.isDirectory()) {
            throw new IOException("Hadoop home directory " + homedir
                + " does not exist, is not a directory, or is not an absolute path.");
          }

          home = homedir.getCanonicalPath();
        } catch (IOException ioe) {
          if (LOG.isDebugEnabled()) {
            LOG.debug("Failed to detect a valid hadoop home directory", ioe);
          }
          home = null;
        }

        return home;
      }

Combined with the stack trace, it is easy to see that HADOOP_HOME_DIR is null: checkHadoopHome() finds neither the hadoop.home.dir JVM property nor the HADOOP_HOME environment variable, so getQualifiedBinPath() builds the literal path null\bin\winutils.exe, which is exactly what the exception reports. Since no Hadoop environment variable had ever been configured on this machine, the error is to be expected.
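To confirm the diagnosis before changing anything, you can print the two settings that checkHadoopHome() consults; when this error occurs, both come back null. A minimal sketch (the class name is just an example):

    public class HadoopHomeCheck {
        public static void main(String[] args) {
            // The two places Shell.checkHadoopHome() looks, in order.
            // When the exception above occurs, both print "null", which is
            // why the message contains the literal "null\bin\winutils.exe".
            System.out.println("hadoop.home.dir = " + System.getProperty("hadoop.home.dir"));
            System.out.println("HADOOP_HOME     = " + System.getenv("HADOOP_HOME"));
        }
    }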

Configuring the environment variable on Windows requires winutils.exe. A helpful developer has already packaged the files needed for a Windows setup; see https://github.com/srccodes/hadoop-common-2.2.0-bin

Environment variable configuration:

(Screenshot: adding HADOOP_HOME to the Windows system environment variables.)
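For reference, a typical setup looks like the following. The D:\hadoop path is only an example: point HADOOP_HOME at whatever directory you extracted the repository above into (it must contain bin\winutils.exe). Adding the bin folder to PATH is also commonly recommended so that native libraries such as hadoop.dll can be found:

    HADOOP_HOME = D:\hadoop
    PATH        = %PATH%;%HADOOP_HOME%\bin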

The machine needs to be restarted (or at least the IDE and any running JVMs relaunched) for the new environment variable to take effect.
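If a restart is inconvenient, or the fix is only needed for a local development run, the same effect can be achieved from code by setting the hadoop.home.dir JVM property before the first Hadoop class is loaded (Shell resolves HADOOP_HOME_DIR in a static initializer, as shown in the source above). The sketch below is only an illustration: the D:\hadoop path and the ZooKeeper quorum are placeholders, and the connection is created through the same HConnectionManager entry point that appears in the stack trace.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HConnection;
    import org.apache.hadoop.hbase.client.HConnectionManager;

    public class HbaseWindowsBootstrap {

        static {
            // Must run before any Hadoop class is initialized; points Hadoop at the
            // directory whose bin\ sub-folder contains winutils.exe.
            // "D:\\hadoop" is an example path -- adjust it to your own machine.
            System.setProperty("hadoop.home.dir", "D:\\hadoop");
        }

        public static void main(String[] args) throws IOException {
            Configuration conf = HBaseConfiguration.create();
            // Placeholder ZooKeeper quorum -- replace with your cluster's address.
            conf.set("hbase.zookeeper.quorum", "your-zookeeper-host");

            // Same entry point that appears in the stack trace above.
            HConnection connection = HConnectionManager.createConnection(conf);
            try {
                System.out.println("HBase connection created without the winutils error.");
            } finally {
                connection.close();
            }
        }
    }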
