[feat] upgrade hadoop/spark/hive default version to 3.x (#4263)
* upgrade hive/spark/hadoop defaults to 3.x
GuoPhilipse authored Mar 6, 2023
1 parent f1deaca commit eb55412
Showing 40 changed files with 270 additions and 227 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -85,8 +85,8 @@ Since the first release of Linkis in 2019, it has accumulated more than **700**

| **Engine Name** | **Supported Component Versions<br/>(Default Dependent Version)** | **Linkis Version Requirements** | **Included in Release Package<br/> By Default** | **Description** |
|:---- |:---- |:---- |:---- |:---- |
|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 2.4.3)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, PySpark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 2.3.3)|\>=1.0.3|Yes |Hive EngineConn, supports HiveQL code|
|Spark|Apache >= 2.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 3.2.1)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, PySpark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 3.1.3)|\>=1.0.3|Yes |Hive EngineConn, supports HiveQL code|
|Python|Python >= 2.6, <br/>(default Python2*)|\>=1.0.3|Yes |Python EngineConn, supports python code|
|Shell|Bash >= 2.0|\>=1.0.3|Yes|Shell EngineConn, supports Bash shell code|
|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(default Hive-jdbc 2.3.4)|\>=1.0.3|No|JDBC EngineConn, already supports MySQL and HiveQL, and can be quickly extended to support other engines with a JDBC Driver package, such as Oracle|
4 changes: 2 additions & 2 deletions README_CN.md
@@ -82,8 +82,8 @@ Since its open-source release in 2019, Linkis has accumulated more than 700 trial enterprises

| **Engine Name** | **Supported Component Versions <br/>(Default Dependent Version)** | **Linkis Version Requirements** | **Included in Release Package By Default** | **Description** |
|:---- |:---- |:---- |:---- |:---- |
|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 2.4.3)|\>=1.0.3||Spark EngineConn, supports SQL, Scala, PySpark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 2.3.3)|\>=1.0.3||Hive EngineConn, supports HiveQL code|
|Spark|Apache >= 2.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 3.2.1)|\>=1.0.3||Spark EngineConn, supports SQL, Scala, PySpark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 3.1.3)|\>=1.0.3||Hive EngineConn, supports HiveQL code|
|Python|Python >= 2.6, <br/>(default Python2*)|\>=1.0.3||Python EngineConn, supports Python code|
|Shell|Bash >= 2.0|\>=1.0.3||Shell EngineConn, supports Bash shell code|
|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(default Hive-jdbc 2.3.4)|\>=1.0.3||JDBC EngineConn, already supports MySQL and HiveQL, and can be quickly extended to support other engines with a JDBC Driver package, such as Oracle|
4 changes: 2 additions & 2 deletions docs/configuration/linkis-computation-governance-common.md
@@ -4,8 +4,8 @@
| Module Name (Service Name) | Parameter Name | Default Value | Description |
| -------- | -------- | ----- |----- |
|linkis-computation-governance-common|wds.linkis.rm| | wds.linkis.rm |
|linkis-computation-governance-common|wds.linkis.spark.engine.version|2.4.3 |spark.engine.version|
|linkis-computation-governance-common|wds.linkis.hive.engine.version| 1.2.1 |hive.engine.version|
|linkis-computation-governance-common|wds.linkis.spark.engine.version|3.2.1 |spark.engine.version|
|linkis-computation-governance-common|wds.linkis.hive.engine.version| 3.1.3 |hive.engine.version|
|linkis-computation-governance-common|wds.linkis.python.engine.version|python2 | python.engine.version |
|linkis-computation-governance-common|wds.linkis.python.code_parser.enabled| false |python.code_parser.enabled|
|linkis-computation-governance-common|wds.linkis.scala.code_parser.enabled| false | scala.code_parser.enabled |
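A note on how these defaults take effect: each wds.linkis.* parameter above is declared through Linkis's CommonVars mechanism, where the built-in default is only a fallback that an entry in the service's properties file (or a system property) overrides. A minimal illustrative sketch; the wrapper class is not part of this commit:

```java
import org.apache.linkis.common.conf.CommonVars;

// Illustrative sketch: how a wds.linkis.* default resolves through CommonVars.
// The second argument is the built-in fallback; a value configured for the
// same key takes precedence over it.
public class EngineVersionDefaults {
  public static void main(String[] args) {
    CommonVars<String> sparkVersion =
        CommonVars.apply("wds.linkis.spark.engine.version", "3.2.1");
    CommonVars<String> hiveVersion =
        CommonVars.apply("wds.linkis.hive.engine.version", "3.1.3");
    System.out.println(sparkVersion.getValue()); // "3.2.1" unless overridden
    System.out.println(hiveVersion.getValue());  // "3.1.3" unless overridden
  }
}
```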
2 changes: 1 addition & 1 deletion docs/configuration/linkis-manager-common.md
@@ -4,7 +4,7 @@
| Module Name (Service Name) | Parameter Name | Default Value | Description |
| -------- | -------- | ----- |----- |
|linkis-manager-common|wds.linkis.default.engine.type |spark|engine.type|
|linkis-manager-common|wds.linkis.default.engine.version |2.4.3|engine.version|
|linkis-manager-common|wds.linkis.default.engine.version |3.2.1|engine.version|
|linkis-manager-common|wds.linkis.manager.admin|hadoop|manager.admin|
|linkis-manager-common|wds.linkis.rm.application.name|ResourceManager|rm.application.name|
|linkis-manager-common|wds.linkis.rm.wait.event.time.out| 1000 * 60 * 12L |event.time.out|
2 changes: 1 addition & 1 deletion docs/configuration/linkis-udf.md
@@ -3,7 +3,7 @@

| Module Name (Service Name) | Parameter Name | Default Value | Description |
| -------- | -------- | ----- |----- |
|linkis-udf|wds.linkis.udf.hive.exec.path |/appcom/Install/DataWorkCloudInstall/linkis-linkis-Udf-0.0.3-SNAPSHOT/lib/hive-exec-1.2.1.jar|udf.hive.exec.path|
|linkis-udf|wds.linkis.udf.hive.exec.path |/appcom/Install/DataWorkCloudInstall/linkis-linkis-Udf-0.0.3-SNAPSHOT/lib/hive-exec-3.1.3.jar|udf.hive.exec.path|
|linkis-udf|wds.linkis.udf.tmp.path|/tmp/udf/|udf.tmp.path|
|linkis-udf|wds.linkis.udf.share.path|/mnt/bdap/udf/|udf.share.path|
|linkis-udf|wds.linkis.udf.share.proxy.user| hadoop|udf.share.proxy.user|
2 changes: 1 addition & 1 deletion docs/errorcode/linkis-configuration-errorcode.md
@@ -15,7 +15,7 @@
|linkis-configuration |14100|CategoryName cannot be included '-'(类别名称不能包含 '-')|CANNOT_BE_INCLUDED|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Creator is null, cannot be added(创建者为空,无法添加)|CREATOR_IS_NULL_CANNOT_BE_ADDED|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Engine type is null, cannot be added(引擎类型为空,无法添加)|ENGINE_TYPE_IS_NULL|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-2.4.3(保存的引擎类型参数有误,请按照固定格式传送,例如spark-2.4.3)|INCORRECT_FIXED_SUCH|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-3.2.1(保存的引擎类型参数有误,请按照固定格式传送,例如spark-3.2.1)|INCORRECT_FIXED_SUCH|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Incomplete request parameters, please reconfirm(请求参数不完整,请重新确认)|INCOMPLETE_RECONFIRM|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Only admin can modify category(只有管理员才能修改目录)|ONLY_ADMIN_CAN_MODIFY|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The label parameter is empty(标签参数为空)|THE_LABEL_PARAMETER_IS_EMPTY|LinkisConfigurationErrorCodeSummary|
2 changes: 1 addition & 1 deletion docs/trino-usage.md
@@ -46,7 +46,7 @@ Linkis 1.X works through labels, so the following needs to be inserted into our database

```
linkis_ps_configuration_config_key: inserts the keys and default values of the engine's configuration parameters
linkis_cg_manager_label: inserts the engine label, e.g. hive-2.3.3
linkis_cg_manager_label: inserts the engine label, e.g. hive-3.1.3
linkis_ps_configuration_category: inserts the category associations of the engine
linkis_ps_configuration_config_value: inserts the configurations to be displayed for the engine
linkis_ps_configuration_key_engine_relation: the associations between configuration items and engines
@@ -24,7 +24,7 @@ public class UJESConstants {
public static final String QUERY_PAGE_SIZE_NAME = "pageSize";
public static final int QUERY_PAGE_SIZE_DEFAULT_VALUE = 100;

public static final Long DRIVER_QUERY_SLEEP_MILLS = 500l;
public static final Long DRIVER_QUERY_SLEEP_MILLS = 500L;
public static final Integer DRIVER_REQUEST_MAX_RETRY_TIME = 3;

public static final String QUERY_STATUS_NAME = "status";
@@ -40,7 +40,4 @@ public class UJESConstants {
public static final Integer IDX_FOR_LOG_TYPE_ALL = 3; // 0: Error 1: WARN 2:INFO 3: ALL

public static final int DEFAULT_PAGE_SIZE = 500;

public static final String DEFAULT_SPARK_ENGINE = "spark-2.4.3";
public static final String DEFAULT_HIVE_ENGINE = "hive-1.2.1";
}
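The two constants deleted above were hard-coded engine-type strings. Code that still needs such defaults can compose them from LabelCommonConfig, the pattern this commit adopts elsewhere. A hedged sketch; the helper class below is illustrative, not from the repository:

```java
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

// Illustrative replacement for the removed DEFAULT_SPARK_ENGINE /
// DEFAULT_HIVE_ENGINE constants: build the strings from LabelCommonConfig so
// they track the configured versions instead of hard-coding "spark-2.4.3".
public class DefaultEngineTypes {
  public static String defaultSparkEngine() {
    return "spark-" + LabelCommonConfig.SPARK_ENGINE_VERSION.getValue();
  }

  public static String defaultHiveEngine() {
    return "hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue();
  }
}
```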
@@ -85,12 +85,12 @@ public void before() {

/* Test different task type */

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "sql",
// "-code", "show tables;show tables;show tables",

//
// "-engineType", "hive-1.2.1",
// "-engineType", "hive-3.1.3",
// "-codeType", "sql",
// "-code", "show tables;",

@@ -101,11 +101,11 @@ public void before() {
"-code",
"whoami",

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "py",
// "-code", "print ('hello')",

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "scala",
// "-codePath", "src/test/resources/testScala.scala",

@@ -18,6 +18,7 @@
package org.apache.linkis.computation.client;

import org.apache.linkis.computation.client.interactive.SubmittableInteractiveJob;
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

/** A test class for submitting a SQL query to the Hive engineConn. */
public class InteractiveJobTest {
@@ -29,7 +30,7 @@ public static void main(String[] args) {
SubmittableInteractiveJob job =
LinkisJobClient.interactive()
.builder()
.setEngineType("hive-2.3.3")
.setEngineType("hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue())
.setRunTypeStr("sql")
.setCreator("IDE")
.setCode("show tables")
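The hunk above ends at the fold. For reference, the complete flow this test exercises looks roughly like the sketch below; the addExecuteUser("hadoop") argument and the submit()/waitForCompleted() calls follow the client's usage documented elsewhere and are assumptions here, not part of this diff:

```java
import org.apache.linkis.computation.client.LinkisJobClient;
import org.apache.linkis.computation.client.interactive.SubmittableInteractiveJob;
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

// Sketch of the full submit-and-wait flow, with the engine type derived from
// the configured default Hive version rather than a hard-coded string.
public class InteractiveJobSketch {
  public static void main(String[] args) {
    SubmittableInteractiveJob job =
        LinkisJobClient.interactive()
            .builder()
            .setEngineType("hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue())
            .setRunTypeStr("sql")
            .setCreator("IDE")
            .setCode("show tables")
            .addExecuteUser("hadoop") // assumed execute user for this sketch
            .build();
    job.submit();           // submit the job to Linkis
    job.waitForCompleted(); // block until the job finishes
  }
}
```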
@@ -18,14 +18,17 @@
package org.apache.linkis.governance.common.conf

import org.apache.linkis.common.conf.{CommonVars, Configuration}
import org.apache.linkis.manager.label.conf.LabelCommonConfig

object GovernanceCommonConf {

val CONF_FILTER_RM = "wds.linkis.rm"

val SPARK_ENGINE_VERSION = CommonVars("wds.linkis.spark.engine.version", "2.4.3")
val SPARK_ENGINE_VERSION =
CommonVars("wds.linkis.spark.engine.version", LabelCommonConfig.SPARK_ENGINE_VERSION.getValue)

val HIVE_ENGINE_VERSION = CommonVars("wds.linkis.hive.engine.version", "1.2.1")
val HIVE_ENGINE_VERSION =
CommonVars("wds.linkis.hive.engine.version", LabelCommonConfig.HIVE_ENGINE_VERSION.getValue)

val PYTHON_ENGINE_VERSION = CommonVars("wds.linkis.python.engine.version", "python2")

@@ -42,8 +42,8 @@ class GovernanceCommonConfTest {
val errorcodedesclen = GovernanceCommonConf.ERROR_CODE_DESC_LEN

Assertions.assertEquals("wds.linkis.rm", conffilterrm)
Assertions.assertEquals("2.4.3", sparkengineversion)
Assertions.assertEquals("1.2.1", hiveengineversion)
Assertions.assertEquals("3.2.1", sparkengineversion)
Assertions.assertEquals("3.1.3", hiveengineversion)
Assertions.assertEquals("python2", pythonengineversion)
Assertions.assertFalse(pythoncodeparserswitch)
Assertions.assertFalse(scalacodeparserswitch)
@@ -28,6 +28,7 @@ import org.apache.linkis.manager.label.builder.factory.{
LabelBuilderFactory,
LabelBuilderFactoryContext
}
import org.apache.linkis.manager.label.conf.LabelCommonConfig
import org.apache.linkis.manager.label.constant.LabelKeyConstant
import org.apache.linkis.manager.label.entity.Label
import org.apache.linkis.manager.label.entity.engine.{CodeLanguageLabel, UserCreatorLabel}
@@ -134,7 +135,8 @@ class CommonEntranceParser(val persistenceManager: PersistenceManager)
private def checkEngineTypeLabel(labels: util.Map[String, Label[_]]): Unit = {
val engineTypeLabel = labels.getOrDefault(LabelKeyConstant.ENGINE_TYPE_KEY, null)
if (null == engineTypeLabel) {
val msg = s"You need to specify engineTypeLabel in labels, such as spark-2.4.3"
val msg = s"You need to specify engineTypeLabel in labels," +
s"such as spark-${LabelCommonConfig.SPARK_ENGINE_VERSION.getValue}"
throw new EntranceIllegalParamException(
EntranceErrorCode.LABEL_PARAMS_INVALID.getErrCode,
EntranceErrorCode.LABEL_PARAMS_INVALID.getDesc + msg
@@ -34,10 +34,10 @@ public class LabelCommonConfig {
CommonVars.apply("wds.linkis.label.entity.packages", "");

public static final CommonVars<String> SPARK_ENGINE_VERSION =
CommonVars.apply("wds.linkis.spark.engine.version", "2.4.3");
CommonVars.apply("wds.linkis.spark.engine.version", "3.2.1");

public static final CommonVars<String> HIVE_ENGINE_VERSION =
CommonVars.apply("wds.linkis.hive.engine.version", "2.3.3");
CommonVars.apply("wds.linkis.hive.engine.version", "3.1.3");

public static final CommonVars<String> PYTHON_ENGINE_VERSION =
CommonVars.apply("wds.linkis.python.engine.version", "python2");
@@ -19,6 +19,7 @@

import org.apache.linkis.manager.label.builder.factory.LabelBuilderFactory;
import org.apache.linkis.manager.label.builder.factory.LabelBuilderFactoryContext;
import org.apache.linkis.manager.label.conf.LabelCommonConfig;
import org.apache.linkis.manager.label.entity.Label;
import org.apache.linkis.manager.label.entity.node.AliasServiceInstanceLabel;
import org.apache.linkis.manager.label.exception.LabelErrorException;
@@ -27,7 +28,9 @@ public class TestLabelBuilder {

public static void main(String[] args) throws LabelErrorException {
LabelBuilderFactory labelBuilderFactory = LabelBuilderFactoryContext.getLabelBuilderFactory();
Label<?> engineType = labelBuilderFactory.createLabel("engineType", "hive-1.2.1");
Label<?> engineType =
labelBuilderFactory.createLabel(
"engineType", "hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue());
System.out.println(engineType.getFeature());

AliasServiceInstanceLabel emInstanceLabel =
@@ -18,12 +18,16 @@
package org.apache.linkis.manager.common.conf

import org.apache.linkis.common.conf.CommonVars
import org.apache.linkis.manager.label.conf.LabelCommonConfig

object ManagerCommonConf {

val DEFAULT_ENGINE_TYPE = CommonVars("wds.linkis.default.engine.type", "spark")

val DEFAULT_ENGINE_VERSION = CommonVars("wds.linkis.default.engine.version", "2.4.3")
val DEFAULT_ENGINE_VERSION = CommonVars(
"wds.linkis.default.engine.version",
LabelCommonConfig.SPARK_ENGINE_VERSION.defaultValue
)

val DEFAULT_ADMIN = CommonVars("wds.linkis.manager.admin", "hadoop")

@@ -71,7 +71,7 @@
<where>
<if test="instance != null"> service_instance = #{instance}</if>
<if test="username != null"> and create_user = #{username}</if>
<!-- label_value in db eg:`hadoop-spark,spark-2.4.3`-->
<!-- label_value in db eg:`hadoop-spark,spark-3.2.1`-->
<if test="engineType !=null">and label_value like concat('%,',#{engineType},'%')</if>
<if test="startDate != null">and create_time BETWEEN #{startDate} AND #{endDate}</if>
</where>
@@ -93,7 +93,7 @@
</if>

<if test="engineTypes != null and engineTypes.size() > 0">
<!-- label_value in db eg:`hadoop-spark,spark-2.4.3`-->
<!-- label_value in db eg:`hadoop-spark,spark-3.2.1`-->
and SUBSTRING_INDEX(SUBSTRING_INDEX(ecr.label_value,',',-1),"-",1) in
<foreach collection="engineTypes" item="i" open="(" close=")" separator=",">
#{i}
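To make the comment above concrete: given the documented label_value layout `userCreator,engineType-version`, the nested SUBSTRING_INDEX calls peel off the bare engine type. A small Java sketch of the equivalent string logic, illustrative only:

```java
// Mirrors SUBSTRING_INDEX(SUBSTRING_INDEX(label_value, ',', -1), '-', 1)
// for a stored label value such as "hadoop-spark,spark-3.2.1".
public class EngineTypeExtraction {
  public static void main(String[] args) {
    String labelValue = "hadoop-spark,spark-3.2.1";
    // SUBSTRING_INDEX(label_value, ',', -1): everything after the last comma.
    String engineTypeAndVersion = labelValue.substring(labelValue.lastIndexOf(',') + 1);
    // SUBSTRING_INDEX(..., '-', 1): everything before the first hyphen.
    String engineType = engineTypeAndVersion.substring(0, engineTypeAndVersion.indexOf('-'));
    System.out.println(engineType); // prints "spark"
  }
}
```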
2 changes: 1 addition & 1 deletion linkis-dist/bin/checkEnv.sh
@@ -37,7 +37,7 @@ function checkPythonAndJava(){

function checkHdfs(){
hadoopVersion="`hdfs version`"
defaultHadoopVersion="2.7"
defaultHadoopVersion="3.3"
checkversion "$hadoopVersion" $defaultHadoopVersion hadoop
}

4 changes: 2 additions & 2 deletions linkis-dist/bin/install.sh
@@ -219,13 +219,13 @@ SERVER_IP=$local_host
##Label set start
if [ "$SPARK_VERSION" != "" ]
then
sed -i ${txt} "s#spark-2.4.3#spark-$SPARK_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#spark-3.2.1#spark-$SPARK_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#\#wds.linkis.spark.engine.version.*#wds.linkis.spark.engine.version=$SPARK_VERSION#g" $common_conf
fi

if [ "$HIVE_VERSION" != "" ]
then
sed -i ${txt} "s#hive-2.3.3#hive-$HIVE_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#hive-3.1.3#hive-$HIVE_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#\#wds.linkis.hive.engine.version.*#wds.linkis.hive.engine.version=$HIVE_VERSION#g" $common_conf
fi

6 changes: 3 additions & 3 deletions linkis-dist/deploy-config/linkis-env.sh
@@ -78,7 +78,7 @@ HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/appcom/config/hadoop-config"}
HADOOP_KERBEROS_ENABLE=${HADOOP_KERBEROS_ENABLE:-"false"}
HADOOP_KEYTAB_PATH=${HADOOP_KEYTAB_PATH:-"/appcom/keytab/"}
## Hadoop env version
HADOOP_VERSION=${HADOOP_VERSION:-"2.7.2"}
HADOOP_VERSION=${HADOOP_VERSION:-"3.3.4"}

#Hive
HIVE_HOME=/appcom/Install/hive
@@ -91,10 +91,10 @@ SPARK_CONF_DIR=/appcom/config/spark-config

## Engine version conf
#SPARK_VERSION
#SPARK_VERSION=2.4.3
#SPARK_VERSION=3.2.1

##HIVE_VERSION
#HIVE_VERSION=2.3.3
#HIVE_VERSION=3.1.3

#PYTHON_VERSION=python2

8 changes: 4 additions & 4 deletions linkis-dist/docker/ldh.Dockerfile
@@ -27,10 +27,10 @@ ARG JDK_VERSION=1.8.0-openjdk
ARG JDK_BUILD_REVISION=1.8.0.332.b09-1.el7_9
ARG MYSQL_JDBC_VERSION=8.0.28

ARG HADOOP_VERSION=2.7.2
ARG HIVE_VERSION=2.3.3
ARG SPARK_VERSION=2.4.3
ARG SPARK_HADOOP_VERSION=2.7
ARG HADOOP_VERSION=3.3.4
ARG HIVE_VERSION=3.1.3
ARG SPARK_VERSION=3.2.1
ARG SPARK_HADOOP_VERSION=3.2
ARG FLINK_VERSION=1.12.2
ARG ZOOKEEPER_VERSION=3.5.9

Original file line number Diff line number Diff line change
Expand Up @@ -27,10 +27,10 @@ rm -rf ${LDH_TAR_DIR} && mkdir -p ${LDH_TAR_DIR}
rm -rf ${PROJECT_TARGET}/entry-point-ldh.sh
cp ${WORK_DIR}/entry-point-ldh.sh ${PROJECT_TARGET}/

HADOOP_VERSION=${HADOOP_VERSION:-2.7.2}
HIVE_VERSION=${HIVE_VERSION:-2.3.3}
SPARK_VERSION=${SPARK_VERSION:-2.4.3}
SPARK_HADOOP_VERSION=${SPARK_HADOOP_VERSION:-2.7}
HADOOP_VERSION=${HADOOP_VERSION:-3.3.4}
HIVE_VERSION=${HIVE_VERSION:-3.1.3}
SPARK_VERSION=${SPARK_VERSION:-3.2.1}
SPARK_HADOOP_VERSION=${SPARK_HADOOP_VERSION:-3.2}
FLINK_VERSION=${FLINK_VERSION:-1.12.2}
ZOOKEEPER_VERSION=${ZOOKEEPER_VERSION:-3.5.9}
MYSQL_JDBC_VERSION=${MYSQL_JDBC_VERSION:-8.0.28}
14 changes: 7 additions & 7 deletions linkis-dist/helm/README.md
@@ -201,9 +201,9 @@ $> kind delete cluster --name test-helm

We introduced a new image, called LDH (Linkis's Hadoop all-in-one image), which provides a pseudo-distributed Hadoop cluster for quick testing. This image contains the following Hadoop components; the default mode for engines in LDH is on-YARN.

* Hadoop 2.7.2, including HDFS and YARN
* Hive 2.3.3
* Spark 2.4.3
* Hadoop 3.3.4, including HDFS and YARN
* Hive 3.1.3
* Spark 3.2.1
* Flink 1.12.2
* ZooKeeper 3.5.9

@@ -245,10 +245,10 @@ drwxrwxrwx - root supergroup 0 2022-07-31 02:48 /user

[root@ldh-96bdc757c-dnkbs /]# beeline -u jdbc:hive2://ldh.ldh.svc.cluster.local:10000/ -n hadoop
Connecting to jdbc:hive2://ldh.ldh.svc.cluster.local:10000/
Connected to: Apache Hive (version 2.3.3)
Driver: Hive JDBC (version 2.3.3)
Connected to: Apache Hive (version 3.1.3)
Driver: Hive JDBC (version 3.1.3)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.3 by Apache Hive
Beeline version 3.1.3 by Apache Hive
0: jdbc:hive2://ldh.ldh.svc.cluster.local:100> create database demo;
No rows affected (1.306 seconds)
0: jdbc:hive2://ldh.ldh.svc.cluster.local:100> use demo;
@@ -271,7 +271,7 @@ No rows affected (5.491 seconds)
22/07/31 02:53:18 INFO hive.metastore: Trying to connect to metastore with URI thrift://ldh.ldh.svc.cluster.local:9083
22/07/31 02:53:18 INFO hive.metastore: Connected to metastore.
...
22/07/31 02:53:19 INFO spark.SparkContext: Running Spark version 2.4.3
22/07/31 02:53:19 INFO spark.SparkContext: Running Spark version 3.2.1
22/07/31 02:53:19 INFO spark.SparkContext: Submitted application: SparkSQL::10.244.0.6
...
22/07/31 02:53:27 INFO yarn.Client: Submitting application application_1659235712576_0001 to ResourceManager