1. Install Eclipse 3.6.2; avoid 3.7, which has many problems.
2. Install the Eclipse IvyDE plugin.
You can install the Apache IvyDE plugin from the IvyDE update site: .
First you have to configure Eclipse by adding the IvyDE update site. To do so, follow these steps (note that for Eclipse 3.4 it may differ):
- Open the update manager in Eclipse: Help > Software Updates > Find and Install...
- In the popup window, select Search for features to install, and click Next
- Then click on New Remote Site...
- Name: Apache Ivy update site
- URL:
- Click OK
A new entry "Apache Ivy update site" will appear in the list of update sites
3. The installation steps below are quoted from another blog; if interested, you can follow the link below to read the original.
-------------------start ref-----
1. Edit $HADOOP_HOME/src/contrib/build-contrib.xml
Add one line: <property name="eclipse.home" location="/home/gushui/eclipse"/> (replace /home/gushui/eclipse above with your own $ECLIPSE_HOME)
2. Edit $HADOOP_HOME/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java
Comment out the original import org.eclipse.jdt.internal.debug.ui.launcher.JavaApplicationLaunchShortcut; and replace it with import org.eclipse.jdt.debug.ui.launchConfigurations.JavaApplicationLaunchShortcut;
3. Run:
- cd $HADOOP_HOME
- ant compile
- ln -sf $HADOOP_HOME/docs $HADOOP_HOME/build/docs
- ant package -Djava5.home=/usr/lib/jvm/java-1.5.0-sun-1.5.0.19 -Dforrest.home=/home/gushui/src/apache-forrest-0.8
Note: install apache-forrest-0.8: , and place it at /home/gushui/src/apache-forrest-0.8
Note: here I used JDK 1.5.0.22 and apache-forrest-0.9.
Note that the java5 path and the apache-forrest path above must be set according to your own installation paths.
OK, the jar should appear at $HADOOP_HOME/build/contrib/eclipse-plugin/hadoop-0.20.3-dev-eclipse-plugin.jar. Rename it to hadoop-0.20.2-eclipse-plugin.jar, and that's it. As for why the rename is needed, I'm not sure either; the version itself is 0.20.2, yet it comes out as 0.20.3. Note: I built 1.0.2; if you are on 0.20.203 you need to change the files above, otherwise the resulting Eclipse plugin cannot connect to the DFS server.
5. A few things to note:
(1) Put this jar into Eclipse's plugins directory and restart Eclipse. That didn't seem to work for me, so I used the dumbest method: deleted Eclipse, re-extracted the tar file, and reinstalled it; after that it worked.
(2) Quoted, and mine behaved the same way: if clicking Eclipse's Run As -> Run on Hadoop still gets no response, first do Run As -> Java Application, then Run As -> Run on Hadoop will work.
-----------------end ref
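The quoted build procedure can be condensed into a short shell sketch. All paths below (Hadoop home, JDK 1.5, Forrest) are the example paths from the text, not real locations on your machine; the commands are only echoed as a dry run so the sequence is easy to see.

```shell
#!/bin/sh
# Dry-run sketch of the plugin build; substitute your own paths.
HADOOP_HOME=${HADOOP_HOME:-/home/gushui/hadoop-0.20.2}

run() { echo "+ $*"; }   # replace `echo "+ $*"` with "$@" to actually build

run cd "$HADOOP_HOME"
run ant compile
run ln -sf "$HADOOP_HOME/docs" "$HADOOP_HOME/build/docs"
run ant package -Djava5.home=/usr/lib/jvm/java-1.5.0-sun-1.5.0.19 \
    -Dforrest.home=/home/gushui/src/apache-forrest-0.8
# the build names the jar 0.20.3-dev; rename it before installing into eclipse/plugins
run mv "$HADOOP_HOME/build/contrib/eclipse-plugin/hadoop-0.20.3-dev-eclipse-plugin.jar" \
       "$HADOOP_HOME/build/contrib/eclipse-plugin/hadoop-0.20.2-eclipse-plugin.jar"
```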
4. Problems I ran into while following the steps above:
a. Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory ...)
Solved by installing the tools:
sudo apt-get install automake autoconf
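Before kicking off the build again, a quick sanity check (a sketch, not part of the original instructions) can report which of the autotools the native build invokes are actually on the PATH:

```shell
#!/bin/sh
# Report whether each build tool is on PATH; "missing" means it still
# needs to be installed (e.g. via apt-get as above).
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: missing"
  fi
}

for t in autoreconf automake autoconf libtoolize; do
  check_tool "$t"
done
```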
b. Another breakage; as my motto goes, we are never that lucky.
[exec] * [15/35] [0/0] 0.086s 0b hdfs_user_guide.pdf
[exec] Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/fop/messaging/MessageHandler
[exec]   at org.apache.cocoon.serialization.FOPSerializer.configure(FOPSerializer.java:122)
[exec]   at org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.java:201)
[exec]   at org.apache.avalon.excalibur.component.DefaultComponentFactory.newInstance(DefaultComponentFactory.java:289)
Solution: ant clean, then repeat the previous steps.
task-controller:
[exec] Can't exec "libtoolize": No such file or directory at /usr/bin/autoreconf line 196.
Solution: sudo apt-get install libtool
[exec] /usr/include/fts.h:41:3: error: #error "<fts.h> cannot be used with -D_FILE_OFFSET_BITS==64"
[exec] make: *** [impl/task-controller.o] Error 1
Solution: I finally found the answer for this one; a fix has already been provided, see
In short, the task-controller does not use large-file operations, so AC_SYS_LARGEFILE can be removed. Steps:
1. Find the file $HADOOP_HOME/src/c++/task-controller/configure.ac
2. Find the line AC_SYS_LARGEFILE and comment it out
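The two steps above can be done with a one-line sed edit. The sketch below runs against a throwaway copy of configure.ac so it is safe to try; in a real tree, point CONF at $HADOOP_HOME/src/c++/task-controller/configure.ac instead. dnl is autoconf's comment marker.

```shell
#!/bin/sh
# Demonstrated on a temp file with made-up contents; set CONF to the
# real configure.ac from step 1 when patching an actual Hadoop tree.
CONF=$(mktemp)
printf 'AC_INIT\nAC_SYS_LARGEFILE\nAC_PROG_CC\n' > "$CONF"

# turn the macro into an m4 comment (dnl) so configure no longer
# defines _FILE_OFFSET_BITS=64; a .bak copy of the original is kept
sed -i.bak 's/^AC_SYS_LARGEFILE/dnl AC_SYS_LARGEFILE/' "$CONF"

cat "$CONF"
```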
Re-run the ant package step. Oh yeah, it passed!!!!