This wiki page is well out of date, as Hadoop 2.x uses Mavenized builds. IntelliJ IDEA can import Maven projects directly via the root pom.xml file.
Hadoop builds with Ant, but you can set it up to work under IDEA for testing and some iterative development. This does not take away the need to run Ant; you just run it side by side.
Set the Ant property build.webapps=build/classes/webapps. The default, 'build/webapps', will not work: StatusHttpServer locates 'webapps' via classloader.getResource("webapps"), so the directory must sit under a classpath root.
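The classpath requirement can be seen with a minimal standalone sketch (an illustration of the classloader behaviour, not Hadoop code): getResource() only finds a resource when some classpath root contains it.

```java
public class ResourceLookupDemo {
    public static void main(String[] args) {
        ClassLoader cl = ResourceLookupDemo.class.getClassLoader();
        // A .class file is always reachable as a resource under its root:
        System.out.println(cl.getResource("java/lang/Object.class") != null);
        // A "webapps" directory is only found when a classpath root contains
        // one, which is why build/webapps (outside build/classes) fails:
        System.out.println(cl.getResource("webapps") != null);
    }
}
```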
Create a new IDEA module for Hadoop.
Add the following source directories to the module:

- build/src
- conf
- src/ant
- src/contrib/streaming/src/java
- src/core
- src/examples
- src/hdfs
- src/mapred
- src/native/src
- src/tools

Notes:

- build/ goes away on a clean build, and needs to be picked up again by resynchronizing IDEA (if that is not automatic).
- build/webapps is not the right place to be picked up by the IDE; moving it under build/resources/ is needed to place it somewhere manageable.
- build/src is required for the compiled JSP files. Unfortunately, there is no separate Ant task to regenerate them; the best option is to run Ant from the command line.
- conf is required so that hadoop-default.xml gets copied to build/classes. Configuration will load hadoop-default.xml as a resource via the classloader.
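The reason conf must be on the classpath can be mimicked in isolation: once a file is copied under a classpath root, a classloader rooted there will find it, which is how Configuration picks up hadoop-default.xml. A sketch (the temp directory stands in for build/classes; the file content is a placeholder):

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ConfOnClasspathDemo {
    public static void main(String[] args) throws Exception {
        // Simulate "conf/ copied into build/classes": create a root directory
        // holding hadoop-default.xml, then treat that root as a classpath entry.
        Path root = Files.createTempDirectory("classes");
        Files.writeString(root.resolve("hadoop-default.xml"), "<configuration/>");
        try (URLClassLoader cl =
                new URLClassLoader(new URL[] { root.toUri().toURL() })) {
            URL found = cl.getResource("hadoop-default.xml");
            System.out.println(found != null); // found, because the root is on the classpath
        }
    }
}
```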
Add the test source directories:

- src/test
- build/test/src

build/ goes away on a clean build; you can re-create build/test/src by running the 'generate-test-records' Ant task.
Set the output and test-output paths to the full path of where Hadoop's Ant build sticks things, such as:
/home/user/hadoop-core/build/classes
/home/user/hadoop-core/build/test/classes
Add the JARs from the lib/ directory as module libraries. Do keep them in sync with the library versions used by the build, especially that of Jetty. For the tests, also add the JARs from the src/test/lib/ directory.
To run JUnit tests under the IDE, create a new test configuration pointing to the chosen tests.
Set the VM parameters to give the tests a log directory:

-Dhadoop.log.dir=/home/user/hadoop-core/build/test/logs
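The -D flag just defines a JVM system property, which the log4j configuration then expands; a quick check (property name from above, the fallback string is hypothetical):

```java
public class LogDirCheck {
    public static void main(String[] args) {
        // With -Dhadoop.log.dir=... on the command line this prints the path;
        // without the flag it falls back to the default shown here.
        String dir = System.getProperty("hadoop.log.dir", "(not set)");
        System.out.println(dir);
    }
}
```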
JRockit users: consider editing conf/log4j.properties to set

log4j.appender.console.layout.ConversionPattern=%-4r %-5p %c %x - %m%n

This may seem odd, but it eliminated deadlocks in the logging.
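In context, the console appender stanza might look like the following (the appender name 'console' and the surrounding lines are assumptions; only the ConversionPattern line comes from this page). Note that %-4r prints milliseconds since startup rather than a formatted date, so date formatting is avoided entirely:

```properties
# Assumed surrounding lines; only the ConversionPattern is from this page.
log4j.rootLogger=INFO,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%-4r %-5p %c %x - %m%n
```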