<7> Hadoop Web Project -- HDFS File Management


   This post references: http://blog.csdn.net/fansy1990/article/details/51356583

I. Project Introduction

   The web project for the recommender system is already finished; this article adds an HDFS file management feature on top of it, to make it easier to manage the file data stored on HDFS. The implementation largely follows fansy1990's HDFS file management project; the changes are that the originally separate functions are consolidated onto a single page, and garbled Chinese text in the display has been fixed.
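   The garbled-Chinese fix is essentially a matter of forcing UTF-8 on both the request and the response before anything is written out. The exact change used in this project is not reproduced here; a minimal sketch along the usual lines (the class name and its web.xml registration are illustrative, not the project's actual code):

// CharacterEncodingFilter.java -- hypothetical helper; forces UTF-8 so Chinese
// file names and file contents are not garbled in the browser.
import java.io.IOException;
import javax.servlet.*;

public class CharacterEncodingFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        request.setCharacterEncoding("UTF-8");   // decode form parameters as UTF-8
        response.setCharacterEncoding("UTF-8");  // encode the JSON/HTML output as UTF-8
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() { }
}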


II. Project Implementation

  1. Opening the menu lists the files and folders under the root directory; clicking a folder drills down into it, and a file name or owner can be entered to filter the results.

  JavaScript code:

$('#dg_hdfsManager_search').datagrid({
    border : false,
    fitColumns : false,
    singleSelect : true,
    width : 1050,
    height : 280,
    nowrap : false,
    fit : false,
    pagination : true,      // paging control
    pageSize : 8,           // rows per page; must be one of the multiples in pageList
    pageList : [ 4, 8, 12 ],
    rownumbers : true,      // row numbers
    pagePosition : 'top',
    url : 'hdfs/hdfsManager_searchFolder.action',
    queryParams : {folder : "/", name : "", nameOp : "no", owner : "", ownerOp : "no"},
    onLoadError : function(){
        console.info("load error!");
        $.messager.alert('警告','读取错误,请联系管理员!','warning');
    },
    onBeforeLoad : function(param){
        return checkExistAndAuth(param.folder, 'rx');
    },
    onLoadSuccess : function(data){
        console.info("success,data:" + data);
    },
    idField : 'id',
    columns : [[
        {field : 'name', title : '文件名', width : '120',
            formatter : function(value, row, index){
                // escaping is enough to work around running out of single/double quotes
                return "<a href='javascript:void(0);' onclick='refreshDir(\"" + value + "\",\"" + row.type + "\")" + "'>" + value + "</a>";
            },
            styler : function(index, row){
                var s1 = "";
                if (row.type == 'dir'){
                    return s1 = 'background-color:#FFCCCC;';
                }
            }
        },
        {field : 'type', title : '类型', width : '50',
            formatter : function(value, row, index){
                return "<a href='javascript:void(0);' onclick='refreshDir(\"" + row.name + "\",\"" + value + "\")" + "'>" + value + "</a>";
            }
        },
        {field : 'size', title : '大小', width : '100'},
        {field : 'replication', title : '副本数', width : '50'},
        {field : 'blockSize', title : '块大小', width : '100'},
        {field : 'modificationTime', title : '修改时间', width : '200'},
        {field : 'permission', title : '权限', width : '150'},
        {field : 'owner', title : '所有者', width : '100'},
        {field : 'group', title : '组名', width : '100'}
    ]]
});
  Java code:

public void listFolder() throws FileNotFoundException, IllegalArgumentException, IOException {
    List<HdfsResponseProperties> files = this.hdfsService.listFolder(hdfsFile.getFolder());
    Map<String, Object> jsonMap = new HashMap<String, Object>();
    jsonMap.put("total", files.size());
    jsonMap.put("rows", Utils.getProperFiles(files, page, rows));
    Utils.write2PrintWriter(JSON.toJSONString(jsonMap));
    return;
}
public List<HdfsResponseProperties> listFolder(String folder)
        throws FileNotFoundException, IllegalArgumentException, IOException {
    List<HdfsResponseProperties> files = new ArrayList<>();
    FileSystem fs = HadoopUtils.getFs();
    FileStatus[] filesStatus = fs.listStatus(new Path(folder));
    for (FileStatus file : filesStatus) {
        files.add(Utils.getDataFromLocatedFileStatus(file));
    }
    return files;
}
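  All of the service methods go through a shared HadoopUtils.getFs() helper, whose implementation is not shown in the original post. A minimal sketch, assuming the NameNode address is supplied through fs.defaultFS (the address below is an assumption, adjust to the actual cluster):

// HadoopUtils.java -- illustrative sketch only; the real project may cache the
// FileSystem differently or read the address from a properties file.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HadoopUtils {
    private static FileSystem fs;

    public static synchronized FileSystem getFs() throws IOException {
        if (fs == null) {
            Configuration conf = new Configuration();
            // Assumed NameNode address, not taken from the original post.
            conf.set("fs.defaultFS", "hdfs://192.168.128.129:8020");
            fs = FileSystem.get(conf);
        }
        return fs;
    }
}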
2. Create a Directory

  Enter a directory name in the directory input box; clicking the "new directory" button creates the directory and navigates into it.

  Main code:

public boolean createFolder(String folder, boolean recursive)
        throws IllegalArgumentException, IOException, AccessControlException {
    FileSystem fs = HadoopUtils.getFs();
    try {
        return fs.mkdirs(new Path(folder));
    } catch (AccessControlException e) {
        throw e;
    }
}
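  Note that FileSystem.mkdirs behaves like "mkdir -p" and also returns true when the path already exists as a directory, so a duplicate name is not reported as an error. If that distinction matters, an existence check can be added first (a small sketch, not part of the original code):

// Sketch: reject the request when the target folder already exists.
Path target = new Path(folder);
FileSystem fs = HadoopUtils.getFs();
if (fs.exists(target)) {
    return false; // e.g. report "folder already exists" back to the page
}
return fs.mkdirs(target);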
3. File Upload


  Main code:

public boolean upload(String src, String des) throws Exception {
    try {
        HadoopUtils.getFs().copyFromLocalFile(new Path(src), new Path(des));
    } catch (IllegalArgumentException | IOException e) {
        log.info("数据上传异常,src:{},des:{}", new Object[] { src, des });
        throw e;
    }
    return true;
}
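  upload() copies a file that is already on the web server's local disk, so the action layer first has to receive the multipart upload from the browser. A rough sketch of that method inside the existing action class, assuming the project's Struts2 fileUpload interceptor and illustrative field names (setters omitted):

// Struts2's fileUpload interceptor injects the temporary file and its original
// name into matching setters on the action (field names are illustrative).
private File file;            // uploaded temp file on the web server
private String fileFileName;  // original file name from the browser
private String folder;        // target HDFS folder chosen on the page

public void uploadFile() throws Exception {
    String src = file.getAbsolutePath();
    String des = folder.endsWith("/") ? folder + fileFileName : folder + "/" + fileFileName;
    boolean flag = this.hdfsService.upload(src, des);
    Map<String, Object> jsonMap = new HashMap<String, Object>();
    jsonMap.put("flag", String.valueOf(flag));
    Utils.write2PrintWriter(JSON.toJSONString(jsonMap));
}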
4. File Download: select the row of the file, then click download to save the file locally.

  Main code:

public boolean download(String fileName, String localFile) throws Exception {
    boolean flag = true;
    try {
        HadoopUtils.getFs().copyToLocalFile(new Path(fileName), new Path(localFile));
    } catch (Exception e) {
        e.printStackTrace();
        log.info("数据下载异常,src:{},des:{}", new Object[] { fileName, localFile });
        throw e;
    }
    return flag;
}
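  Keep in mind that copyToLocalFile writes to the local disk of the machine running the web application, not to the visitor's computer. If the browser should save the file directly, the action can instead stream the HDFS file into the HTTP response, roughly like this (a sketch assuming Struts2's ServletActionContext, not the original implementation):

// Sketch: stream an HDFS file straight to the browser as an attachment.
HttpServletResponse response = ServletActionContext.getResponse();
response.setContentType("application/octet-stream");
response.setHeader("Content-Disposition",
        "attachment; filename=\"" + new Path(fileName).getName() + "\"");
FSDataInputStream in = HadoopUtils.getFs().open(new Path(fileName));
try {
    // org.apache.hadoop.io.IOUtils copies the stream with a 4 KB buffer
    IOUtils.copyBytes(in, response.getOutputStream(), 4096, false);
} finally {
    in.close();
}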
5. Delete: select a file or folder and click delete to remove it; deleting a folder removes all of the files and subdirectories inside it.

 The JavaScript checks whether the selected item is a file or a directory and calls the corresponding action:

$('#dg_hdfsManager_folder_delete_btn').bind('click', function(){
    var row = $('#dg_hdfsManager_search').datagrid('getSelected');
    var value = row.type;
    var name = row.name;
    var curr = $('#hdfsManager_search_folder').val();
    if(curr == '/'){
        var folder_ = curr + name;
    }else{
        var folder_ = curr + '/' + name;
    }
    if(value == 'dir'){
        var recursive_ = "true";
        // submit the task via ajax
        var result = callByAJaxHdfs('hdfs/hdfsManager_deleteFolder.action', {folder:folder_, recursive:recursive_});
        if("true" == result.flag){
            $.messager.alert('信息','目录删除成功!','info');
        }else if("false" == result.flag){
            $.messager.alert('信息','目录删除失败,'+result.msg,'info');
        }
    }else if(value != 'dir'){
        var flag = checkExistAndAuth(folder_, 'x');
        if(!flag) return;
        var file_ = folder_;
        // submit the task via ajax
        var result = callByAJaxHdfs('hdfs/hdfsManager_deleteFile.action', {fileName:file_});
        if("true" == result.flag){
            $.messager.alert('信息','文件删除成功!','info');
        }else if("false" == result.flag){
            $.messager.alert('信息','文件删除失败,'+result.msg,'info');
        }
    }
    search_data();
});
 Java code:

public boolean deleteFolder(String folder, boolean recursive)
        throws IllegalArgumentException, IOException {
    FileSystem fs = HadoopUtils.getFs();
    try {
        return fs.delete(new Path(folder), recursive);
    } catch (RemoteException e) {
        throw e;
    }
}

public boolean deleteFile(String fileName) throws Exception {
    boolean flag = false;
    try {
        flag = HadoopUtils.getFs().delete(new Path(fileName), false);
    } catch (IllegalArgumentException | IOException e) {
        log.info("数据删除异常,fileName:{}", new Object[] { fileName });
        throw e;
    }
    return flag;
}
6. File Viewing: select a file; if it is a plain-text file, clicking its file name opens it and shows the content.


  Main code:

public static String readText(String fileName, int records)
        throws IllegalArgumentException, IOException {
    FileSystem fs = getFs();
    FSDataInputStream inStream = fs.open(new Path(fileName));
    BufferedReader br = new BufferedReader(new InputStreamReader(inStream));
    StringBuffer buffer = new StringBuffer();
    try {
        String line;
        line = br.readLine();
        while (line != null && records-- > 0) {
            buffer.append(line).append("<br>");
            line = br.readLine();
        }
    } finally {
        br.close();
        inStream.close();
    }
    return buffer.toString();
}
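  One detail related to the garbled-Chinese fix mentioned in the introduction: new InputStreamReader(inStream) uses the JVM's default charset, so Chinese text files stored as UTF-8 can still come out garbled on some servers. Specifying the charset explicitly avoids that (a small variant of the reader line above, assuming the files are UTF-8):

// Read the HDFS text file as UTF-8 instead of the platform default charset.
BufferedReader br = new BufferedReader(
        new InputStreamReader(inStream, "UTF-8"));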

III. Adding HBase and Hive Table Data Management

 Integrate HBase and Hive table data management features.

 For HBase, the implementation again follows fansy1990's HBase table management project.
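 That project's code is not reproduced here; for reference, listing the HBase table names through the client API looks roughly like this (a sketch using the Connection/Admin API, with an assumed ZooKeeper address; older HBase versions use HBaseAdmin instead):

// Sketch: fetch all HBase table names, mirroring the Hive getTables() below.
Configuration conf = HBaseConfiguration.create();
conf.set("hbase.zookeeper.quorum", "192.168.128.129"); // assumed ZooKeeper host
Connection conn = ConnectionFactory.createConnection(conf);
try {
    Admin admin = conn.getAdmin();
    for (TableName tableName : admin.listTableNames()) {
        System.out.println(tableName.getNameAsString());
    }
    admin.close();
} finally {
    conn.close();
}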


 The Hive part was adapted from examples found online; the service must be started first with ./bin/hive --service hiveserver. For now it only lists the table names.


 Code:

public void getTables() throws Exception {
    Map<String, Object> jsonMap = new HashMap<String, Object>();
    int columns = 0;
    List<HiveTable> alltables = new ArrayList<>();
    // register the Hive JDBC driver
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    // open the connection to Hive: URL, USER, PASSWORD (default port 10000)
    Connection conn = DriverManager.getConnection("jdbc:hive://192.168.128.129:10000/default", "hh", "");
    Statement stmt = conn.createStatement();
    String sql = "show tables";
    ResultSet res = stmt.executeQuery(sql);
    while (res.next()) {
        System.out.println(res.getString(1));
        columns++;
        HiveTable hivetable = new HiveTable();
        BeanUtils.setProperty(hivetable, "tableName", res.getString(1));
        alltables.add(hivetable);
    }
    conn.close();
    conn = null;
    jsonMap.put("total", columns);
    jsonMap.put("rows", Utils.getProperFiles(alltables, page, rows));
    Utils.write2PrintWriter(JSON.toJSONString(jsonMap));
    return;
}
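 getTables() fills a HiveTable bean through BeanUtils.setProperty, so the bean only needs a tableName property with a standard getter and setter. A minimal sketch of what that class presumably looks like:

// HiveTable.java -- minimal bean assumed by BeanUtils.setProperty above.
public class HiveTable {
    private String tableName;

    public String getTableName() {
        return tableName;
    }

    public void setTableName(String tableName) {
        this.tableName = tableName;
    }
}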






  
