Update code to support newer java versions #586
base: master
Changes from all commits: 18b4234, 40dfcce, 4eb1191, e5bedc0, 27085de, 0e48596
**Travis CI configuration (`.travis.yml`)**

```
@@ -1,6 +1,9 @@
dist: trusty
sudo: required
language: java
jdk:
  - openjdk11
  - openjdk8
  - openjdk7
before_install:
  - cat /etc/hosts # optionally check the content *before*
```
@@ -10,32 +13,53 @@ before_install: | |
| - cat /proc/cpuinfo | grep cores | wc -l | ||
| - free -h | ||
| install: | ||
| - hibench=$(pwd) | ||
| - cd /opt/ | ||
| - wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz | ||
| - tar -xzf spark-1.6.0-bin-hadoop2.6.tgz | ||
| - wget https://archive.apache.org/dist/hadoop/core/hadoop-2.6.5/hadoop-2.6.5.tar.gz | ||
| - tar -xzf hadoop-2.6.5.tar.gz | ||
| - cd ${hibench} | ||
| - cp ./travis/spark-env.sh /opt/spark-1.6.0-bin-hadoop2.6/conf/ | ||
| - cp ./travis/core-site.xml /opt/hadoop-2.6.5/etc/hadoop/ | ||
| - cp ./travis/hdfs-site.xml /opt/hadoop-2.6.5/etc/hadoop/ | ||
| - cp ./travis/mapred-site.xml /opt/hadoop-2.6.5/etc/hadoop/ | ||
| - cp ./travis/yarn-site.xml /opt/hadoop-2.6.5/etc/hadoop/ | ||
| - cp ./travis/hibench.conf ./conf/ | ||
| - cp ./travis/benchmarks.lst ./conf/ | ||
| - | | ||
| export java_ver=$(./travis/jdk_ver.sh) | ||
| if [[ "$java_ver" == 11 ]]; then | ||
| export HADOOP_VER=3.2.0 | ||
| export SPARK_VER=2.4.3 | ||
| export SPARK_PACKAGE_TYPE=without-hadoop-scala-2.12 | ||
| elif [[ "$java_ver" == 8 ]]; then | ||
| export HADOOP_VER=3.2.0 | ||
| export SPARK_VER=2.4.3 | ||
| export SPARK_PACKAGE_TYPE=without-hadoop | ||
| elif [[ "$java_ver" == 7 ]]; then | ||
| export HADOOP_VER=2.6.5 | ||
| export SPARK_VER=1.6.0 | ||
| export SPARK_PACKAGE_TYPE=hadoop2.6 | ||
| else | ||
| exit 1 | ||
| fi | ||
|
|
||
| # Folders where are stored Spark and Hadoop depending on version required | ||
| export SPARK_BINARIES_FOLDER=spark-$SPARK_VER-bin-$SPARK_PACKAGE_TYPE | ||
| export HADOOP_BINARIES_FOLDER=hadoop-$HADOOP_VER | ||
| export HADOOP_CONF_DIR=/opt/$HADOOP_BINARIES_FOLDER/etc/hadoop/ | ||
| export HADOOP_HOME=/opt/$HADOOP_BINARIES_FOLDER | ||
|
|
||
| sudo -E ./travis/install_hadoop_spark.sh | ||
| sudo -E ./travis/config_hadoop_spark.sh | ||
luisfponce marked this conversation as resolved.
Show resolved
Hide resolved
|
||
| before_script: | ||
| - "export JAVA_OPTS=-Xmx512m" | ||
| cache: | ||
| directories: | ||
| - $HOME/.m2 | ||
| script: | ||
| - mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11 | ||
| - mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11 | ||
| - mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10 | ||
| - sudo -E ./travis/configssh.sh | ||
| - sudo -E ./travis/restart_hadoop_spark.sh | ||
| - cp ./travis/hadoop.conf ./conf/ | ||
| - cp ./travis/spark.conf ./conf/ | ||
| - /opt/hadoop-2.6.5/bin/yarn node -list 2 | ||
| - sudo -E ./bin/run_all.sh | ||
luisfponce marked this conversation as resolved.
Show resolved
Hide resolved
|
||
| - | | ||
|
> **Reviewer:** nit: remove this.
>
> **luisfponce (author):** I used the pipe (YAML literal block style) because, from my perspective, it looks cleaner when putting a script into a YAML file. The other way it would look like this example instead of how it currently is. Up to you; both ways still work for me (and the YAML linter indicates both are valid).
>
> **Reviewer:** Thanks, but it seems that the version without any pipes is also valid.
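Since the author's inline snippets did not survive extraction, the following is a minimal illustration of the two styles under discussion; the commands are reused from this config purely as placeholders, and it is not the exact example from the comment:

```yaml
# Style mentioned by the reviewer: one list item per command.
script:
  - sudo -E ./travis/configssh.sh
  - sudo -E ./bin/run_all.sh
---
# (the "---" only separates the two alternatives as YAML documents)
# Style used in this PR: a single literal block scalar ("pipe") holding a multi-line script.
script:
  - |
    sudo -E ./travis/configssh.sh
    sudo -E ./bin/run_all.sh
```

Both forms are valid YAML; in Travis CI the list form reports each item as a separate script step, while the literal block runs as one multi-line shell snippet.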
```
    if [[ "$java_ver" == 11 ]]; then
      mvn clean package -Psparkbench -Phadoopbench -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.12 -Dmaven-compiler-plugin.version=3.8.0 -Dexclude-streaming
    elif [[ "$java_ver" == 8 ]]; then
      mvn clean package -q -Dmaven.javadoc.skip=true -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.11
```
> **Reviewer:** Curious about this: even if we don't run SQL module tests for Spark 2.4, how did the compiling work...
```
      sudo -E ./travis/configssh.sh
      sudo -E ./travis/restart_hadoop_spark.sh
      sudo -E ./bin/run_all.sh
    elif [[ "$java_ver" == 7 ]]; then
      mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
      mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
      mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
      sudo -E ./travis/configssh.sh
      sudo -E ./travis/restart_hadoop_spark.sh
      sudo -E ./bin/run_all.sh
    else
      exit 1
    fi
```
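The new install step keys everything off `./travis/jdk_ver.sh`, whose contents are not part of this diff. As a hedged sketch only, a helper like it would just need to print the JDK major version; the parsing approach below is an assumption, not the PR's actual script:

```bash
#!/usr/bin/env bash
# Hypothetical sketch only -- the real travis/jdk_ver.sh is not shown in this diff.
# Prints the major Java version of the active JDK, e.g. "7", "8" or "11".
set -euo pipefail

# `java -version` writes to stderr, e.g. 'openjdk version "11.0.3"' or 'java version "1.8.0_212"'.
version=$(java -version 2>&1 | awk -F '"' '/version/ {print $2; exit}')

if [[ "$version" == 1.* ]]; then
  # Legacy scheme: 1.7.x -> 7, 1.8.x -> 8
  echo "$version" | cut -d. -f2
else
  # Modern scheme: 11.0.3 -> 11
  echo "$version" | cut -d. -f1
fi
```

With output like that, the `if [[ "$java_ver" == 11 ]]` branches above can pick the matching Hadoop/Spark artifacts.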
**Build documentation**

```
@@ -28,7 +28,7 @@ Because some Maven plugins cannot support Scala version perfectly, there are som
### Specify Spark Version ###
To specify the spark version, use -Dspark=xxx(1.6, 2.0, 2.1 or 2.2). By default, it builds for spark 2.0
To specify the spark version, use -Dspark=xxx(1.6, 2.0, 2.1, 2.2 or 2.4). By default, it builds for spark 2.0
```
> **Reviewer:** Actually this Spark 2.4 support doesn't include ...
```
    mvn -Psparkbench -Dspark=1.6 -Dscala=2.11 clean package
tips:

@@ -37,6 +37,11 @@ default . For example , if we want use spark2.0 and scala2.11 to build hibench.
package` , but for spark2.0 and scala2.10 , we need use the command `mvn -Dspark=2.0 -Dscala=2.10 clean package` .
Similarly , the spark1.6 is associated with the scala2.10 by default.

### Specify Hadoop Version ###
To specify the spark version, use -Dhadoop=xxx(3.2). By default, it builds for hadoop 2.4
```
> **Reviewer:** nit: spark version -> hadoop version
```
    mvn -Psparkbench -Dhadoop=3.2 -Dspark=2.4 clean package

### Build a single module ###
If you are only interested in a single workload in HiBench. You can build a single module. For example, the below command only builds the SQL workloads for Spark.

@@ -48,3 +53,13 @@ Supported modules includes: micro, ml(machine learning), sql, websearch, graph,
For Spark 2.0 and Spark 2.1, we add the benchmark support for Structured Streaming. This is a new module which cannot be compiled in Spark 1.6. And it won't get compiled by default even if you specify the spark version as 2.0 or 2.1. You must explicitly specify it like this:

    mvn -Psparkbench -Dmodules -PstructuredStreaming clean package

### Build using JDK 1.11
**For Java 11 it is suitable to be built for Spark 2.4 _(Compiled with Scala 2.12)_ and/or Hadoop 3.2 only**

If you are interested in building using Java 11 indicate that streaming benchmarks won't be compiled, and specify scala, spark, hadoop and maven compiler version as below

    mvn clean package -Psparkbench -Phadoopbench -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.12 -Dexclude-streaming -Dmaven-compiler-plugin.version=3.8.0

Supported frameworks only: hadoopbench, sparkbench (Does not support flinkbench, stormbench, gearpumpbench)
Supported modules includes: micro, ml(machine learning), websearch and graph (does not support streaming, structuredStreaming and sql)
```
**Maven POM**

```
@@ -159,7 +159,43 @@
      </dependencies>
      <activation>
        <property>
          <name>!modules</name>
          <name>!exclude-streaming</name>
```
> **Reviewer:** If a user specifies `modules=xxx` and doesn't specify `exclude-streaming`, this allModules will be activated, which is not expected.
```
        </property>
      </activation>
    </profile>

    <profile>
      <id>exclude-streaming</id>
      <dependencies>
        <dependency>
          <groupId>com.intel.hibench.sparkbench</groupId>
          <artifactId>sparkbench-micro</artifactId>
          <version>${project.version}</version>
        </dependency>
        <dependency>
          <groupId>com.intel.hibench.sparkbench</groupId>
          <artifactId>sparkbench-ml</artifactId>
          <version>${project.version}</version>
        </dependency>
        <dependency>
          <groupId>com.intel.hibench.sparkbench</groupId>
          <artifactId>sparkbench-websearch</artifactId>
          <version>${project.version}</version>
        </dependency>
        <dependency>
          <groupId>com.intel.hibench.sparkbench</groupId>
          <artifactId>sparkbench-graph</artifactId>
          <version>${project.version}</version>
        </dependency>
        <dependency>
          <groupId>com.intel.hibench.sparkbench</groupId>
          <artifactId>sparkbench-sql</artifactId>
          <version>${project.version}</version>
        </dependency>
      </dependencies>
      <activation>
        <property>
          <name>exclude-streaming</name>
        </property>
      </activation>
    </profile>
```
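To make the interaction between the two activation blocks above concrete, here is a hedged usage sketch; the commands are assembled from the flags documented in this PR rather than taken from an actual build log:

```bash
# exclude-streaming is NOT defined, so the profile activated by
# <name>!exclude-streaming</name> (all modules, streaming included) is in effect.
mvn -Psparkbench clean package

# Defining the property on the command line activates the exclude-streaming profile
# instead, which lists micro, ml, websearch, graph and sql but no streaming modules.
mvn -Psparkbench -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.12 -Dexclude-streaming clean package
```

This is also the point the reviewer raises: activation now hinges only on `exclude-streaming`, so passing `-Dmodules=xxx` by itself no longer deactivates the all-modules profile.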