A C program accessing Hadoop runs correctly in the terminal, but nothing shows up in the web UI, because of wrong connection parameters: cause analysis and solution

Source: Internet  Editor: 程序博客网  Time: 2024/06/10 02:34

Code:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>
#include "/usr/local/hadoop/src/c++/libhdfs/hdfs.h"

int main(int argc, char **argv) {
    if (argc != 2) {
        fprintf(stderr, "Usage: hdfs_read <file>\n");
        exit(-1);
    }

    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "Oops! Failed to connect to hdfs!\n");
        exit(-1);
    }

    const char *readFileName = argv[1];
    tSize bufferSize = 100;

    /* open for reading; replication 3, block size 100 */
    hdfsFile readFile = hdfsOpenFile(fs, readFileName, O_RDONLY, bufferSize, 3, 100);
    if (!readFile) {
        fprintf(stderr, "Failed to open %s for reading!\n", readFileName);
        exit(-2);
    }
    printf("readFile = %ld\n", (long) readFile);

    /* buffer that will receive the data (one extra byte for the terminator) */
    char *buffer = malloc(bufferSize + 1);
    if (buffer == NULL) {
        printf("malloc error\n");
        return -2;
    }

    memset(buffer, 0, bufferSize + 1);
    tSize getNum = hdfsRead(fs, readFile, (void *) buffer, 20);
    printf("read %d characters\n", getNum);
    /* terminate at the byte count actually read; the code as originally
       posted used buffer[sizeof(buffer) + 1], which indexes by the size
       of the pointer, not the allocation, and writes out of bounds */
    buffer[getNum > 0 ? getNum : 0] = '\0';
    printf("content read: %s*\n", buffer);

    free(buffer);
    hdfsCloseFile(fs, readFile);
    hdfsDisconnect(fs);
    return 0;
}
```



After a good deal of fiddling it finally compiled, but after running it and browsing http://localhost:50075/browseDirectory.jsp, nothing ever showed up.
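For reference, since getting this to compile took some fiddling: a build invocation along the following lines is a sketch only, since the Hadoop directory layout and JDK location are assumptions that vary per install (libhdfs also needs `libjvm` at link time, and the JVM classpath must contain the Hadoop jars and config at run time):

```shell
# Paths below are assumptions; adjust to your Hadoop and JDK locations.
HADOOP_HOME=/usr/local/hadoop
JAVA_HOME=/usr/lib/jvm/java-6-sun    # hypothetical JDK path

gcc test.c \
    -I "$HADOOP_HOME/src/c++/libhdfs" \
    -L "$HADOOP_HOME/c++/Linux-amd64-64/lib" -lhdfs \
    -L "$JAVA_HOME/jre/lib/amd64/server" -ljvm \
    -o test
```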

```
hadoop@springwater-Aspire-4743:/usr/local/hadoop/test$ ./test /my/t.txt
readFile = 166133208
read 0 characters
content read: *
```
Later I went and checked the parameter documentation for hdfsConnect, and found that

```c
hdfsFS fs = hdfsConnect("default", 0);
```

connects to the local filesystem by default.

To connect to the distributed filesystem, you have to specify the host and port explicitly.


Change it to:

```c
hdfsFS fs = hdfsConnect("localhost", 9000);
```

and it works.

```
hadoop@springwater-Aspire-4743:/usr/local/hadoop/test$ ./test /my/t.txt
readFile = 167189144
read 20 characters
content read: 郭��*
```

(The trailing garbled bytes are expected: reading exactly 20 bytes can cut a multi-byte UTF-8 character in half.)


Hahaha, finally got both the Java and C example programs working. Sooooooooooo happy!