2022-11-19
Installing Hue and Integrating It with Hadoop and Hive
Environment:
centos7
jdk1.8.0_111
Hadoop 2.7.3
Hive1.2.2
hue-3.10.0
Hue installation:
1. Download hue-3.10.0.tgz:
2. Install the build dependencies:
yum install libffi-devel
yum install gmp-devel
yum install python-devel mysql-devel
yum install ant gcc gcc-c++ rsync krb5-devel mysql openssl-devel cyrus-sasl-devel cyrus-sasl-gssapi sqlite-devel openldap-devel python-simplejson
yum install libtidy libxml2-devel libxslt-devel
yum install python-devel python-simplejson python-setuptools
yum install maven
3. Build Hue:
tar -xzvf hue-3.10.0.tgz
cd hue-3.10.0
make apps
make install
Integrating Hue with Hadoop:
HDFS:
hdfs-site.xml configuration:
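The post does not reproduce the hdfs-site.xml changes. For Hue to browse HDFS, the essential change is enabling WebHDFS. A minimal sketch using the standard Hadoop 2.x property name (adjust to your cluster):

```xml
<!-- hdfs-site.xml: enable WebHDFS so Hue can reach HDFS over HTTP -->
<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>
```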
core-site.xml configuration:
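The core-site.xml changes are also omitted in the post. Hue typically needs Hadoop proxy-user (impersonation) settings so the hue service user can act on behalf of logged-in users. A sketch, assuming the service runs as the user `hue` (the wildcards are for simplicity; tighten them in production):

```xml
<!-- core-site.xml: allow the hue user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>
```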
Distribute the modified hdfs-site.xml and core-site.xml to the other cluster nodes.
vi hue-3.10.0/desktop/conf/hue.ini
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      # Enter the filesystem uri
      fs_defaultfs=hdfs://localhost:8020
      # Use WebHdfs/HttpFs as the communication mechanism.
      # Domain should be the NameNode or HttpFs host.
      webhdfs_url=

vi hue-3.10.0/desktop/conf/hue.ini
[hadoop]
  [[yarn_clusters]]
    [[[default]]]
      # Enter the host on which you are running the ResourceManager
      resourcemanager_host=localhost
      # Whether to submit jobs to this cluster
      submit_to=True
      # URL of the ResourceManager API
      resourcemanager_api_url=
      # URL of the ProxyServer API
      proxy_api_url=
      # URL of the HistoryServer API
      history_server_api_url=

vi hue-3.10.0/desktop/conf/hue.ini
[beeswax]
  # Host where HiveServer2 is running.
  hive_server_host=localhost
  # Hive configuration directory, where hive-site.xml is located
  hive_conf_dir=/etc/hive/conf
Modify the hive-site.xml configuration:
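The hive-site.xml changes are not shown in the post. A sketch of the settings Hue's Beeswax app typically relies on, assuming the metastore and HiveServer2 run locally on their default ports (adjust hosts and ports to your cluster):

```xml
<!-- hive-site.xml sketch: local metastore and HiveServer2 on defaults -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://localhost:9083</value>
</property>
<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>localhost</value>
</property>
<property>
  <name>hive.server2.thrift.port</name>
  <value>10000</value>
</property>
```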
Start the Hive services:
$ bin/hive --service metastore
Note: the metastore service is what Hive uses to connect to the metastore database in MySQL.
$ bin/hive --service hiveserver2
Note: the hiveserver2 service provides JDBC access to Hive; the default port is 10000.
Start Hue:
build/env/bin/supervisor
Open the Hue web UI in a browser (port 8888 by default) and log in.
References:
http://opexlabs.com/2016/07/20/compiling-hue-centos-7/
http://gethue.com/how-to-configure-hue-in-your-hadoop-cluster/