Fix the problem of Sonar code scanning #4369

Open · wants to merge 14 commits into base: master

Changes from 1 commit
[feat] upgrade hadoop/spark/hive default version to 3.x (#4263)
* upgrade hive/spark/hadoop to default 3.x
GuoPhilipse authored Mar 6, 2023
commit eb55412389c70cd5b9cb8e39f195cea4bf786842
4 changes: 2 additions & 2 deletions README.md
@@ -85,8 +85,8 @@ Since the first release of Linkis in 2019, it has accumulated more than **700**

| **Engine Name** | **Supported Component Version<br/>(Default Dependent Version)** | **Linkis Version Requirements** | **Included in Release Package<br/> By Default** | **Description** |
|:---- |:---- |:---- |:---- |:---- |
|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 2.4.3)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, Pyspark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 2.3.3)|\>=1.0.3|Yes |Hive EngineConn, supports HiveQL code|
|Spark|Apache >= 2.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 3.2.1)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, Pyspark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 3.1.3)|\>=1.0.3|Yes |Hive EngineConn, supports HiveQL code|
|Python|Python >= 2.6, <br/>(default Python2*)|\>=1.0.3|Yes |Python EngineConn, supports python code|
|Shell|Bash >= 2.0|\>=1.0.3|Yes|Shell EngineConn, supports Bash shell code|
|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(default Hive-jdbc 2.3.4)|\>=1.0.3|No|JDBC EngineConn, already supports MySQL and HiveQL, and can be quickly extended to support other engines with a JDBC Driver package, such as Oracle|
4 changes: 2 additions & 2 deletions README_CN.md
@@ -82,8 +82,8 @@ Linkis 自 2019 年开源发布以来,已累计积累了 700 多家试验企

| **引擎名** | **支持底层组件版本 <br/>(默认依赖版本)** | **Linkis 版本要求** | **是否默认包含在发布包中** | **说明** |
|:---- |:---- |:---- |:---- |:---- |
|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(默认 Apache Spark 2.4.3)|\>=1.0.3||Spark EngineConn, 支持 SQL, Scala, Pyspark 和 R 代码|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(默认 Apache Hive 2.3.3)|\>=1.0.3||Hive EngineConn, 支持 HiveQL 代码|
|Spark|Apache >= 2.0.0, <br/>CDH >= 5.4.0, <br/>(默认 Apache Spark 3.2.1)|\>=1.0.3||Spark EngineConn, 支持 SQL, Scala, Pyspark 和 R 代码|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(默认 Apache Hive 3.1.3)|\>=1.0.3||Hive EngineConn, 支持 HiveQL 代码|
|Python|Python >= 2.6, <br/>(默认 Python2*)|\>=1.0.3||Python EngineConn, 支持 python 代码|
|Shell|Bash >= 2.0|\>=1.0.3||Shell EngineConn, 支持 Bash shell 代码|
|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(默认 Hive-jdbc 2.3.4)|\>=1.0.3||JDBC EngineConn, 已支持 MySQL 和 HiveQL,可快速扩展支持其他有 JDBC Driver 包的引擎, 如 Oracle|
4 changes: 2 additions & 2 deletions docs/configuration/linkis-computation-governance-common.md
@@ -4,8 +4,8 @@
| Module Name (Service Name) | Parameter Name | Default Value | Description |
| -------- | -------- | ----- |----- |
|linkis-computation-governance-common|wds.linkis.rm| | wds.linkis.rm |
|linkis-computation-governance-common|wds.linkis.spark.engine.version|2.4.3 |spark.engine.version|
|linkis-computation-governance-common|wds.linkis.hive.engine.version| 1.2.1 |hive.engine.version|
|linkis-computation-governance-common|wds.linkis.spark.engine.version|3.2.1 |spark.engine.version|
|linkis-computation-governance-common|wds.linkis.hive.engine.version| 3.1.3 |hive.engine.version|
|linkis-computation-governance-common|wds.linkis.python.engine.version|python2 | python.engine.version |
|linkis-computation-governance-common|wds.linkis.python.code_parser.enabled| false |python.code_parser.enabled|
|linkis-computation-governance-common|wds.linkis.scala.code_parser.enabled| false | scala.code_parser.enabled |
2 changes: 1 addition & 1 deletion docs/configuration/linkis-manager-common.md
@@ -4,7 +4,7 @@
| Module Name (Service Name) | Parameter Name | Default Value | Description |Used|
| -------- | -------- | ----- |----- | ----- |
|linkis-manager-common|wds.linkis.default.engine.type |spark|engine.type|
|linkis-manager-common|wds.linkis.default.engine.version |2.4.3|engine.version|
|linkis-manager-common|wds.linkis.default.engine.version |3.2.1|engine.version|
|linkis-manager-common|wds.linkis.manager.admin|hadoop|manager.admin|
|linkis-manager-common|wds.linkis.rm.application.name|ResourceManager|rm.application.name|
|linkis-manager-common|wds.linkis.rm.wait.event.time.out| 1000 * 60 * 12L |event.time.out|
2 changes: 1 addition & 1 deletion docs/configuration/linkis-udf.md
@@ -3,7 +3,7 @@

| Module Name (Service Name) | Parameter Name | Default Value | Description |Used|
| -------- | -------- | ----- |----- | ----- |
|linkis-udf|wds.linkis.udf.hive.exec.path |/appcom/Install/DataWorkCloudInstall/linkis-linkis-Udf-0.0.3-SNAPSHOT/lib/hive-exec-1.2.1.jar|udf.hive.exec.path|
|linkis-udf|wds.linkis.udf.hive.exec.path |/appcom/Install/DataWorkCloudInstall/linkis-linkis-Udf-0.0.3-SNAPSHOT/lib/hive-exec-3.1.3.jar|udf.hive.exec.path|
|linkis-udf|wds.linkis.udf.tmp.path|/tmp/udf/|udf.tmp.path|
|linkis-udf|wds.linkis.udf.share.path|/mnt/bdap/udf/|udf.share.path|
|linkis-udf|wds.linkis.udf.share.proxy.user| hadoop|udf.share.proxy.user|
2 changes: 1 addition & 1 deletion docs/errorcode/linkis-configuration-errorcode.md
@@ -15,7 +15,7 @@
|linkis-configuration |14100|CategoryName cannot be included '-'(类别名称不能包含 '-')|CANNOT_BE_INCLUDED|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Creator is null, cannot be added(创建者为空,无法添加)|CREATOR_IS_NULL_CANNOT_BE_ADDED|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Engine type is null, cannot be added(引擎类型为空,无法添加)|ENGINE_TYPE_IS_NULL|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-2.4.3(保存的引擎类型参数有误,请按照固定格式传送,例如spark-2.4.3)|INCORRECT_FIXED_SUCH|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-3.2.1(保存的引擎类型参数有误,请按照固定格式传送,例如spark-3.2.1)|INCORRECT_FIXED_SUCH|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Incomplete request parameters, please reconfirm(请求参数不完整,请重新确认)|INCOMPLETE_RECONFIRM|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Only admin can modify category(只有管理员才能修改目录)|ONLY_ADMIN_CAN_MODIFY|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The label parameter is empty(标签参数为空)|THE_LABEL_PARAMETER_IS_EMPTY|LinkisConfigurationErrorCodeSummary|
2 changes: 1 addition & 1 deletion docs/trino-usage.md
@@ -46,7 +46,7 @@ Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入

```
linkis_ps_configuration_config_key: 插入引擎的配置参数的key和默认values
linkis_cg_manager_label:插入引擎label如:hive-2.3.3
linkis_cg_manager_label:插入引擎label如:hive-3.1.3
linkis_ps_configuration_category: 插入引擎的目录关联关系
linkis_ps_configuration_config_value: 插入引擎需要展示的配置
linkis_ps_configuration_key_engine_relation:配置项和引擎的关联关系
@@ -24,7 +24,7 @@ public class UJESConstants {
public static final String QUERY_PAGE_SIZE_NAME = "pageSize";
public static final int QUERY_PAGE_SIZE_DEFAULT_VALUE = 100;

public static final Long DRIVER_QUERY_SLEEP_MILLS = 500l;
public static final Long DRIVER_QUERY_SLEEP_MILLS = 500L;
public static final Integer DRIVER_REQUEST_MAX_RETRY_TIME = 3;

public static final String QUERY_STATUS_NAME = "status";
@@ -40,7 +40,4 @@ public class UJESConstants {
public static final Integer IDX_FOR_LOG_TYPE_ALL = 3; // 0: Error 1: WARN 2:INFO 3: ALL

public static final int DEFAULT_PAGE_SIZE = 500;

public static final String DEFAULT_SPARK_ENGINE = "spark-2.4.3";
public static final String DEFAULT_HIVE_ENGINE = "hive-1.2.1";
}
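
The two hardcoded engine labels removed above are not simply deleted: the same strings are rebuilt from `LabelCommonConfig` elsewhere in this commit (see the `InteractiveJobTest` and `TestLabelBuilder` hunks below), so the defaults track the configured versions. A minimal sketch of the replacement pattern, assuming only the `CommonVars#getValue` API that appears later in this diff:

```java
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

// Hypothetical helper (not part of this PR) showing how the removed
// DEFAULT_SPARK_ENGINE / DEFAULT_HIVE_ENGINE constants can be derived
// from the centralized LabelCommonConfig defaults instead.
public class DefaultEngineLabels {

  // e.g. "spark-3.2.1" with the defaults introduced by this commit
  public static String defaultSparkEngine() {
    return "spark-" + LabelCommonConfig.SPARK_ENGINE_VERSION.getValue();
  }

  // e.g. "hive-3.1.3" with the defaults introduced by this commit
  public static String defaultHiveEngine() {
    return "hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue();
  }
}
```
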
@@ -85,12 +85,12 @@ public void before() {

/* Test different task type */

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "sql",
// "-code", "show tables;show tables;show tables",

//
// "-engineType", "hive-1.2.1",
// "-engineType", "hive-3.1.3",
// "-codeType", "sql",
// "-code", "show tables;",

@@ -101,11 +101,11 @@ public void before() {
"-code",
"whoami",

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "py",
// "-code", "print ('hello')",

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "scala",
// "-codePath", "src/test/resources/testScala.scala",

@@ -18,6 +18,7 @@
package org.apache.linkis.computation.client;

import org.apache.linkis.computation.client.interactive.SubmittableInteractiveJob;
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

/** A test class for submitting a sql to the hive engineConn. */
public class InteractiveJobTest {
@@ -29,7 +30,7 @@ public static void main(String[] args) {
SubmittableInteractiveJob job =
LinkisJobClient.interactive()
.builder()
.setEngineType("hive-2.3.3")
.setEngineType("hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue())
.setRunTypeStr("sql")
.setCreator("IDE")
.setCode("show tables")
@@ -18,14 +18,17 @@
package org.apache.linkis.governance.common.conf

import org.apache.linkis.common.conf.{CommonVars, Configuration}
import org.apache.linkis.manager.label.conf.LabelCommonConfig

object GovernanceCommonConf {

val CONF_FILTER_RM = "wds.linkis.rm"

val SPARK_ENGINE_VERSION = CommonVars("wds.linkis.spark.engine.version", "2.4.3")
val SPARK_ENGINE_VERSION =
CommonVars("wds.linkis.spark.engine.version", LabelCommonConfig.SPARK_ENGINE_VERSION.getValue)

val HIVE_ENGINE_VERSION = CommonVars("wds.linkis.hive.engine.version", "1.2.1")
val HIVE_ENGINE_VERSION =
CommonVars("wds.linkis.hive.engine.version", LabelCommonConfig.HIVE_ENGINE_VERSION.getValue)

val PYTHON_ENGINE_VERSION = CommonVars("wds.linkis.python.engine.version", "python2")
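
After this change, `wds.linkis.spark.engine.version` and `wds.linkis.hive.engine.version` no longer carry literal defaults of their own; their fallback comes from `LabelCommonConfig`. A hedged Java sketch of the resulting lookup order, assuming `CommonVars.apply(key, default)` resolves a configured value first and otherwise returns the supplied default (the usage pattern visible in this diff):

```java
import org.apache.linkis.common.conf.CommonVars;
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

public class EngineVersionLookup {
  public static void main(String[] args) {
    // Resolution order (assumed from the CommonVars usage in this diff):
    // 1. an explicit "wds.linkis.spark.engine.version" setting, if present;
    // 2. otherwise LabelCommonConfig's default, which this commit bumps to 3.2.1.
    CommonVars<String> sparkVersion = CommonVars.apply(
        "wds.linkis.spark.engine.version",
        LabelCommonConfig.SPARK_ENGINE_VERSION.getValue());
    System.out.println(sparkVersion.getValue()); // "3.2.1" unless overridden
  }
}
```
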

@@ -42,8 +42,8 @@ class GovernanceCommonConfTest {
val errorcodedesclen = GovernanceCommonConf.ERROR_CODE_DESC_LEN

Assertions.assertEquals("wds.linkis.rm", conffilterrm)
Assertions.assertEquals("2.4.3", sparkengineversion)
Assertions.assertEquals("1.2.1", hiveengineversion)
Assertions.assertEquals("3.2.1", sparkengineversion)
Assertions.assertEquals("3.1.3", hiveengineversion)
Assertions.assertEquals("python2", pythonengineversion)
Assertions.assertFalse(pythoncodeparserswitch)
Assertions.assertFalse(scalacodeparserswitch)
@@ -28,6 +28,7 @@ import org.apache.linkis.manager.label.builder.factory.{
LabelBuilderFactory,
LabelBuilderFactoryContext
}
import org.apache.linkis.manager.label.conf.LabelCommonConfig
import org.apache.linkis.manager.label.constant.LabelKeyConstant
import org.apache.linkis.manager.label.entity.Label
import org.apache.linkis.manager.label.entity.engine.{CodeLanguageLabel, UserCreatorLabel}
@@ -134,7 +135,8 @@ class CommonEntranceParser(val persistenceManager: PersistenceManager)
private def checkEngineTypeLabel(labels: util.Map[String, Label[_]]): Unit = {
val engineTypeLabel = labels.getOrDefault(LabelKeyConstant.ENGINE_TYPE_KEY, null)
if (null == engineTypeLabel) {
val msg = s"You need to specify engineTypeLabel in labels, such as spark-2.4.3"
val msg = s"You need to specify engineTypeLabel in labels, " +
  s"such as spark-${LabelCommonConfig.SPARK_ENGINE_VERSION.getValue}"
throw new EntranceIllegalParamException(
EntranceErrorCode.LABEL_PARAMS_INVALID.getErrCode,
EntranceErrorCode.LABEL_PARAMS_INVALID.getDesc + msg
@@ -34,10 +34,10 @@ public class LabelCommonConfig {
CommonVars.apply("wds.linkis.label.entity.packages", "");

public static final CommonVars<String> SPARK_ENGINE_VERSION =
CommonVars.apply("wds.linkis.spark.engine.version", "2.4.3");
CommonVars.apply("wds.linkis.spark.engine.version", "3.2.1");

public static final CommonVars<String> HIVE_ENGINE_VERSION =
CommonVars.apply("wds.linkis.hive.engine.version", "2.3.3");
CommonVars.apply("wds.linkis.hive.engine.version", "3.1.3");

public static final CommonVars<String> PYTHON_ENGINE_VERSION =
CommonVars.apply("wds.linkis.python.engine.version", "python2");
@@ -19,6 +19,7 @@

import org.apache.linkis.manager.label.builder.factory.LabelBuilderFactory;
import org.apache.linkis.manager.label.builder.factory.LabelBuilderFactoryContext;
import org.apache.linkis.manager.label.conf.LabelCommonConfig;
import org.apache.linkis.manager.label.entity.Label;
import org.apache.linkis.manager.label.entity.node.AliasServiceInstanceLabel;
import org.apache.linkis.manager.label.exception.LabelErrorException;
@@ -27,7 +28,9 @@ public class TestLabelBuilder {

public static void main(String[] args) throws LabelErrorException {
LabelBuilderFactory labelBuilderFactory = LabelBuilderFactoryContext.getLabelBuilderFactory();
Label<?> engineType = labelBuilderFactory.createLabel("engineType", "hive-1.2.1");
Label<?> engineType =
labelBuilderFactory.createLabel(
"engineType", "hive-" + LabelCommonConfig.HIVE_ENGINE_VERSION.getValue());
System.out.println(engineType.getFeature());

AliasServiceInstanceLabel emInstanceLabel =
@@ -18,12 +18,16 @@
package org.apache.linkis.manager.common.conf

import org.apache.linkis.common.conf.CommonVars
import org.apache.linkis.manager.label.conf.LabelCommonConfig

object ManagerCommonConf {

val DEFAULT_ENGINE_TYPE = CommonVars("wds.linkis.default.engine.type", "spark")

val DEFAULT_ENGINE_VERSION = CommonVars("wds.linkis.default.engine.version", "2.4.3")
val DEFAULT_ENGINE_VERSION = CommonVars(
"wds.linkis.default.engine.version",
LabelCommonConfig.SPARK_ENGINE_VERSION.defaultValue
)

val DEFAULT_ADMIN = CommonVars("wds.linkis.manager.admin", "hadoop")
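
Note that `ManagerCommonConf` reads `SPARK_ENGINE_VERSION.defaultValue` rather than `.getValue`: the fallback for `wds.linkis.default.engine.version` is pinned to the compiled-in default instead of also following runtime overrides of `wds.linkis.spark.engine.version`. A hedged sketch of the distinction; the Java accessor spelling is an assumption inferred from the Scala field used above:

```java
import org.apache.linkis.manager.label.conf.LabelCommonConfig;

public class DefaultVsConfigured {
  public static void main(String[] args) {
    // Pinned compile-time default ("3.2.1" after this commit); the
    // defaultValue() spelling from Java is assumed from the Scala field.
    String pinned = LabelCommonConfig.SPARK_ENGINE_VERSION.defaultValue();

    // Effective value: also honors a wds.linkis.spark.engine.version override.
    String effective = LabelCommonConfig.SPARK_ENGINE_VERSION.getValue();

    System.out.println(pinned + " / " + effective);
  }
}
```
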

@@ -71,7 +71,7 @@
<where>
<if test="instance != null"> service_instance = #{instance}</if>
<if test="username != null"> and create_user = #{username}</if>
<!-- label_value in db eg:`hadoop-spark,spark-2.4.3`-->
<!-- label_value in db eg:`hadoop-spark,spark-3.2.1`-->
<if test="engineType !=null">and label_value like concat('%,',#{engineType},'%')</if>
<if test="startDate != null">and create_time BETWEEN #{startDate} AND #{endDate}</if>
</where>
@@ -93,7 +93,7 @@
</if>

<if test="engineTypes != null and engineTypes.size() > 0">
<!-- label_value in db eg:`hadoop-spark,spark-2.4.3`-->
<!-- label_value in db eg:`hadoop-spark,spark-3.2.1`-->
and SUBSTRING_INDEX(SUBSTRING_INDEX(ecr.label_value,',',-1),"-",1) in
<foreach collection="engineTypes" item="i" open="(" close=")" separator=",">
#{i}
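
As the comments above note, `label_value` stores a combined `userCreator,engineType` pair. A small worked illustration of what the `SUBSTRING_INDEX(SUBSTRING_INDEX(ecr.label_value,',',-1),"-",1)` expression extracts, written as plain Java string handling (the sample value is the one from the comment in the diff):

```java
public class LabelValueParsing {
  public static void main(String[] args) {
    String labelValue = "hadoop-spark,spark-3.2.1";   // userCreator,engineType

    // SUBSTRING_INDEX(label_value, ',', -1) -> everything after the last comma
    String engineTypeLabel =
        labelValue.substring(labelValue.lastIndexOf(',') + 1); // "spark-3.2.1"

    // SUBSTRING_INDEX(..., '-', 1) -> everything before the first dash
    String engineType = engineTypeLabel.split("-", 2)[0];      // "spark"

    System.out.println(engineType);
  }
}
```
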
2 changes: 1 addition & 1 deletion linkis-dist/bin/checkEnv.sh
@@ -37,7 +37,7 @@ function checkPythonAndJava(){

function checkHdfs(){
hadoopVersion="`hdfs version`"
defaultHadoopVersion="2.7"
defaultHadoopVersion="3.3"
checkversion "$hadoopVersion" $defaultHadoopVersion hadoop
}

4 changes: 2 additions & 2 deletions linkis-dist/bin/install.sh
@@ -219,13 +219,13 @@ SERVER_IP=$local_host
##Label set start
if [ "$SPARK_VERSION" != "" ]
then
sed -i ${txt} "s#spark-2.4.3#spark-$SPARK_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#spark-3.2.1#spark-$SPARK_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#\#wds.linkis.spark.engine.version.*#wds.linkis.spark.engine.version=$SPARK_VERSION#g" $common_conf
fi

if [ "$HIVE_VERSION" != "" ]
then
sed -i ${txt} "s#hive-2.3.3#hive-$HIVE_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#hive-3.1.3#hive-$HIVE_VERSION#g" $LINKIS_HOME/db/linkis_dml.sql
sed -i ${txt} "s#\#wds.linkis.hive.engine.version.*#wds.linkis.hive.engine.version=$HIVE_VERSION#g" $common_conf
fi

6 changes: 3 additions & 3 deletions linkis-dist/deploy-config/linkis-env.sh
@@ -78,7 +78,7 @@ HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/appcom/config/hadoop-config"}
HADOOP_KERBEROS_ENABLE=${HADOOP_KERBEROS_ENABLE:-"false"}
HADOOP_KEYTAB_PATH=${HADOOP_KEYTAB_PATH:-"/appcom/keytab/"}
## Hadoop env version
HADOOP_VERSION=${HADOOP_VERSION:-"2.7.2"}
HADOOP_VERSION=${HADOOP_VERSION:-"3.3.4"}

#Hive
HIVE_HOME=/appcom/Install/hive
@@ -91,10 +91,10 @@ SPARK_CONF_DIR=/appcom/config/spark-config

## Engine version conf
#SPARK_VERSION
#SPARK_VERSION=2.4.3
#SPARK_VERSION=3.2.1

##HIVE_VERSION
#HIVE_VERSION=2.3.3
#HIVE_VERSION=3.1.3

#PYTHON_VERSION=python2

8 changes: 4 additions & 4 deletions linkis-dist/docker/ldh.Dockerfile
@@ -27,10 +27,10 @@ ARG JDK_VERSION=1.8.0-openjdk
ARG JDK_BUILD_REVISION=1.8.0.332.b09-1.el7_9
ARG MYSQL_JDBC_VERSION=8.0.28

ARG HADOOP_VERSION=2.7.2
ARG HIVE_VERSION=2.3.3
ARG SPARK_VERSION=2.4.3
ARG SPARK_HADOOP_VERSION=2.7
ARG HADOOP_VERSION=3.3.4
ARG HIVE_VERSION=3.1.3
ARG SPARK_VERSION=3.2.1
ARG SPARK_HADOOP_VERSION=3.2
ARG FLINK_VERSION=1.12.2
ARG ZOOKEEPER_VERSION=3.5.9

8 changes: 4 additions & 4 deletions linkis-dist/docker/scripts/prepare-ldh-image.sh
@@ -27,10 +27,10 @@ rm -rf ${LDH_TAR_DIR} && mkdir -p ${LDH_TAR_DIR}
rm -rf ${PROJECT_TARGET}/entry-point-ldh.sh
cp ${WORK_DIR}/entry-point-ldh.sh ${PROJECT_TARGET}/

HADOOP_VERSION=${HADOOP_VERSION:-2.7.2}
HIVE_VERSION=${HIVE_VERSION:-2.3.3}
SPARK_VERSION=${SPARK_VERSION:-2.4.3}
SPARK_HADOOP_VERSION=${SPARK_HADOOP_VERSION:-2.7}
HADOOP_VERSION=${HADOOP_VERSION:-3.3.4}
HIVE_VERSION=${HIVE_VERSION:-3.1.3}
SPARK_VERSION=${SPARK_VERSION:-3.2.1}
SPARK_HADOOP_VERSION=${SPARK_HADOOP_VERSION:-3.2}
FLINK_VERSION=${FLINK_VERSION:-1.12.2}
ZOOKEEPER_VERSION=${ZOOKEEPER_VERSION:-3.5.9}
MYSQL_JDBC_VERSION=${MYSQL_JDBC_VERSION:-8.0.28}
14 changes: 7 additions & 7 deletions linkis-dist/helm/README.md
@@ -201,9 +201,9 @@ $> kind delete cluster --name test-helm

We introduced a new image, called LDH (Linkis's hadoop all-in-one image), which provides a pseudo-distributed hadoop cluster for quick testing. This image contains the following hadoop components; the default mode for engines in LDH is on-yarn.

* Hadoop 2.7.2, including HDFS and YARN
* Hive 2.3.3
* Spark 2.4.3
* Hadoop 3.3.4, including HDFS and YARN
* Hive 3.1.3
* Spark 3.2.1
* Flink 1.12.2
* ZooKeeper 3.5.9

@@ -245,10 +245,10 @@ drwxrwxrwx - root supergroup 0 2022-07-31 02:48 /user

[root@ldh-96bdc757c-dnkbs /]# beeline -u jdbc:hive2://ldh.ldh.svc.cluster.local:10000/ -n hadoop
Connecting to jdbc:hive2://ldh.ldh.svc.cluster.local:10000/
Connected to: Apache Hive (version 2.3.3)
Driver: Hive JDBC (version 2.3.3)
Connected to: Apache Hive (version 3.1.3)
Driver: Hive JDBC (version 3.1.3)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.3 by Apache Hive
Beeline version 3.1.3 by Apache Hive
0: jdbc:hive2://ldh.ldh.svc.cluster.local:100> create database demo;
No rows affected (1.306 seconds)
0: jdbc:hive2://ldh.ldh.svc.cluster.local:100> use demo;
@@ -271,7 +271,7 @@ No rows affected (5.491 seconds)
22/07/31 02:53:18 INFO hive.metastore: Trying to connect to metastore with URI thrift://ldh.ldh.svc.cluster.local:9083
22/07/31 02:53:18 INFO hive.metastore: Connected to metastore.
...
22/07/31 02:53:19 INFO spark.SparkContext: Running Spark version 2.4.3
22/07/31 02:53:19 INFO spark.SparkContext: Running Spark version 3.2.1
22/07/31 02:53:19 INFO spark.SparkContext: Submitted application: SparkSQL::10.244.0.6
...
22/07/31 02:53:27 INFO yarn.Client: Submitting application application_1659235712576_0001 to ResourceManager
14 changes: 7 additions & 7 deletions linkis-dist/helm/README_CN.md
@@ -190,9 +190,9 @@ $> kind delete cluster --name test-helm
## 使用 LDH 进行测试
我们引入了一个新的镜像,叫做LDH(Linkis 的 hadoop 一体式镜像),它提供了一个伪分布式的 hadoop 集群,方便快速测试 On Hadoop 的部署模式。
这个镜像包含以下多个 hadoop 组件,LDH 中引擎的默认模式是 on-yarn 的。
* Hadoop 2.7.2, 包括 HDFS and YARN
* Hive 2.3.3
* Spark 2.4.3
* Hadoop 3.3.4, 包括 HDFS and YARN
* Hive 3.1.3
* Spark 3.2.1
* Flink 1.12.2
* ZooKeeper 3.5.9

@@ -236,10 +236,10 @@ drwxrwxrwx - root supergroup 0 2022-07-31 02:48 /user

[root@ldh-96bdc757c-dnkbs /]# beeline -u jdbc:hive2://ldh.ldh.svc.cluster.local:10000/ -n hadoop
Connecting to jdbc:hive2://ldh.ldh.svc.cluster.local:10000/
Connected to: Apache Hive (version 2.3.3)
Driver: Hive JDBC (version 2.3.3)
Connected to: Apache Hive (version 3.1.3)
Driver: Hive JDBC (version 3.1.3)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.3 by Apache Hive
Beeline version 3.1.3 by Apache Hive
0: jdbc:hive2://ldh.ldh.svc.cluster.local:100> create database demo;
No rows affected (1.306 seconds)
0: jdbc:hive2://ldh.ldh.svc.cluster.local:100> use demo;
@@ -262,7 +262,7 @@ No rows affected (5.491 seconds)
22/07/31 02:53:18 INFO hive.metastore: Trying to connect to metastore with URI thrift://ldh.ldh.svc.cluster.local:9083
22/07/31 02:53:18 INFO hive.metastore: Connected to metastore.
...
22/07/31 02:53:19 INFO spark.SparkContext: Running Spark version 2.4.3
22/07/31 02:53:19 INFO spark.SparkContext: Running Spark version 3.2.1
22/07/31 02:53:19 INFO spark.SparkContext: Submitted application: SparkSQL::10.244.0.6
...
22/07/31 02:53:27 INFO yarn.Client: Submitting application application_1659235712576_0001 to ResourceManager
@@ -1183,12 +1183,12 @@ data:
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = '*-*,*-*');
-- spark2.4.3 default configuration
-- spark default configuration
insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_value`, `config_label_id`)
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = @SPARK_ALL);
-- hive1.2.1 default configuration
-- hive default configuration
insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_value`, `config_label_id`)
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = @HIVE_ALL);
6 changes: 3 additions & 3 deletions linkis-dist/helm/charts/linkis/values.yaml
@@ -111,7 +111,7 @@ linkis:
python:
version: 2.7
hadoop:
version: 2.7.2
version: 3.3.4
configMapName: hadoop-conf
yarn:
restfulUrl: http://ldh.ldh.svc.cluster.local:8088
@@ -123,10 +123,10 @@ linkis:
keytab: /etc/hadoop-conf/yarn.keytab
krb5: /etc/krb5.keytab
spark:
version: 2.4.3
version: 3.2.1
configMapName: spark-conf
hive:
version: 2.3.3
version: 3.1.3
configMapName: hive-conf
meta:
url: "jdbc:mysql://mysql.mysql.svc.cluster.local:3306/hive_metadata?&amp;createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8&amp;useSSL=false" # jdbc:mysql://localhost:3306/metastore?useUnicode=true
8 changes: 4 additions & 4 deletions linkis-dist/helm/scripts/prepare-for-spark.sh
@@ -28,10 +28,10 @@ ECM_POD_NAME=`kubectl get pods -n linkis -l app.kubernetes.io/instance=linkis-de
kubectl cp ./ldh -n linkis ${ECM_POD_NAME}:/opt/ ;


kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "chmod +x /opt/ldh/1.3.0/spark-2.4.3-bin-hadoop2.7/bin/*"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "ln -s /opt/ldh/1.3.0/spark-2.4.3-bin-hadoop2.7 /opt/ldh/current/spark"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "ln -s /opt/ldh/1.3.0/hadoop-2.7.2 /opt/ldh/current/hadoop"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "ln -s /opt/ldh/1.3.0/apache-hive-2.3.3-bin /opt/ldh/current/hive"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "chmod +x /opt/ldh/1.3.0/spark-3.2.1-bin-hadoop3.2/bin/*"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "ln -s /opt/ldh/1.3.0/spark-3.2.1-bin-hadoop3.2 /opt/ldh/current/spark"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "ln -s /opt/ldh/1.3.0/hadoop-3.3.4 /opt/ldh/current/hadoop"
kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "ln -s /opt/ldh/1.3.0/apache-hive-3.1.3-bin /opt/ldh/current/hive"


kubectl exec -it -n linkis ${ECM_POD_NAME} -- bash -c "echo 'export SPARK_HOME=/opt/ldh/current/spark' |sudo tee --append /etc/profile"
2 changes: 1 addition & 1 deletion linkis-dist/package/bin/linkis-cli-hive
@@ -161,6 +161,6 @@ else
parse
fi

exec ${WORK_DIR}/bin/linkis-cli-pre -engineType hive-2.3.3 -codeType hql "${PARSED_CMD[@]}"
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType hive-3.1.3 -codeType hql "${PARSED_CMD[@]}"


6 changes: 3 additions & 3 deletions linkis-dist/package/bin/linkis-cli-spark-submit
@@ -192,9 +192,9 @@ else
fi

if [ "$IS_PYSPARK"x == "true"x ]; then
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType spark-2.4.3 -codeType py "${PARSED_CMD[@]}"
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType spark-3.2.1 -codeType py "${PARSED_CMD[@]}"
elif [ "IS_SCALA"x == "true"x ]; then
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType spark-2.4.3 -codeType scala "${PARSED_CMD[@]}"
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType spark-3.2.1 -codeType scala "${PARSED_CMD[@]}"
else
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType spark-2.4.3 "${PARSED_CMD[@]}"
exec ${WORK_DIR}/bin/linkis-cli-pre -engineType spark-3.2.1 "${PARSED_CMD[@]}"
fi
14 changes: 7 additions & 7 deletions linkis-dist/package/db/linkis_dml.sql
@@ -18,8 +18,8 @@


-- 变量:
SET @SPARK_LABEL="spark-2.4.3";
SET @HIVE_LABEL="hive-2.3.3";
SET @SPARK_LABEL="spark-3.2.1";
SET @HIVE_LABEL="hive-3.1.3";
SET @PYTHON_LABEL="python-python2";
SET @PIPELINE_LABEL="pipeline-1";
SET @JDBC_LABEL="jdbc-4";
@@ -189,18 +189,18 @@ insert into `linkis_cg_manager_label` (`label_key`, `label_value`, `label_featur
insert into `linkis_cg_manager_label` (`label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES ('combined_userCreator_engineType', @PRESTO_ALL, 'OPTIONAL', 2, now(), now());
insert into `linkis_cg_manager_label` (`label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES ('combined_userCreator_engineType', @TRINO_ALL, 'OPTIONAL', 2, now(), now());

-- Custom correlation engine (e.g. spark-2.4.3) and configKey value
-- Custom correlation engine (e.g. spark) and configKey value
-- Global Settings
insert into `linkis_ps_configuration_key_engine_relation` (`config_key_id`, `engine_type_label_id`)
(select config.id as `config_key_id`, label.id AS `engine_type_label_id` FROM linkis_ps_configuration_config_key config
INNER JOIN linkis_cg_manager_label label ON config.engine_conn_type is null and label.label_value = "*-*,*-*");

-- spark-2.4.3(Here choose to associate all spark type Key values with spark2.4.3)
-- spark(Here choose to associate all spark type Key values with spark)
insert into `linkis_ps_configuration_key_engine_relation` (`config_key_id`, `engine_type_label_id`)
(select config.id as `config_key_id`, label.id AS `engine_type_label_id` FROM linkis_ps_configuration_config_key config
INNER JOIN linkis_cg_manager_label label ON config.engine_conn_type = 'spark' and label.label_value = @SPARK_ALL);

-- hive-1.2.1
-- hive
insert into `linkis_ps_configuration_key_engine_relation` (`config_key_id`, `engine_type_label_id`)
(select config.id as `config_key_id`, label.id AS `engine_type_label_id` FROM linkis_ps_configuration_config_key config
INNER JOIN linkis_cg_manager_label label ON config.engine_conn_type = 'hive' and label_value = @HIVE_ALL);
@@ -318,12 +318,12 @@ insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_val
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = '*-*,*-*');

-- spark2.4.3 default configuration
-- spark default configuration
insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_value`, `config_label_id`)
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = @SPARK_ALL);

-- hive1.2.1 default configuration
-- hive default configuration
insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_value`, `config_label_id`)
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = @HIVE_ALL);
14 changes: 7 additions & 7 deletions linkis-dist/package/db/module/linkis_configuration_dml.sql
@@ -18,8 +18,8 @@


-- 变量:
SET @SPARK_LABEL="spark-2.4.3";
SET @HIVE_LABEL="hive-1.2.1";
SET @SPARK_LABEL="spark-3.2.1";
SET @HIVE_LABEL="hive-3.1.3";
SET @PYTHON_LABEL="python-python2";
SET @PIPELINE_LABEL="pipeline-*";
SET @JDBC_LABEL="jdbc-4";
@@ -109,18 +109,18 @@ insert into `linkis_cg_manager_label` (`label_key`, `label_value`, `label_featur
insert into `linkis_cg_manager_label` (`label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES ('combined_userCreator_engineType',@PIPELINE_ALL, 'OPTIONAL', 2, now(), now());
insert into `linkis_cg_manager_label` (`label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES ('combined_userCreator_engineType',@JDBC_ALL, 'OPTIONAL', 2, now(), now());

-- Custom correlation engine (e.g. spark-2.4.3) and configKey value
-- Custom correlation engine (e.g. spark) and configKey value
-- Global Settings
insert into `linkis_ps_configuration_key_engine_relation` (`config_key_id`, `engine_type_label_id`)
(select config.id as `config_key_id`, label.id AS `engine_type_label_id` FROM linkis_ps_configuration_config_key config
INNER JOIN linkis_cg_manager_label label ON config.engine_conn_type is null and label.label_value = "*-*,*-*");

-- spark-2.4.3(Here choose to associate all spark type Key values with spark2.4.3)
-- spark(Here choose to associate all spark type Key values with spark)
insert into `linkis_ps_configuration_key_engine_relation` (`config_key_id`, `engine_type_label_id`)
(select config.id as `config_key_id`, label.id AS `engine_type_label_id` FROM linkis_ps_configuration_config_key config
INNER JOIN linkis_cg_manager_label label ON config.engine_conn_type = 'spark' and label.label_value = @SPARK_ALL);

-- hive-1.2.1
-- hive
insert into `linkis_ps_configuration_key_engine_relation` (`config_key_id`, `engine_type_label_id`)
(select config.id as `config_key_id`, label.id AS `engine_type_label_id` FROM linkis_ps_configuration_config_key config
INNER JOIN linkis_cg_manager_label label ON config.engine_conn_type = 'hive' and label_value = @HIVE_ALL);
@@ -206,12 +206,12 @@ insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_val
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = '*-*,*-*');

-- spark2.4.3 default configuration
-- spark default configuration
insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_value`, `config_label_id`)
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = @SPARK_ALL);

-- hive1.2.1 default configuration
-- hive default configuration
insert into `linkis_ps_configuration_config_value` (`config_key_id`, `config_value`, `config_label_id`)
(select `relation`.`config_key_id` AS `config_key_id`, '' AS `config_value`, `relation`.`engine_type_label_id` AS `config_label_id` FROM linkis_ps_configuration_key_engine_relation relation
INNER JOIN linkis_cg_manager_label label ON relation.engine_type_label_id = label.id AND label.label_value = @HIVE_ALL);
8 changes: 4 additions & 4 deletions linkis-dist/pom.xml
@@ -211,10 +211,10 @@
<linkis.home>/opt/linkis</linkis.home>
<linkis.conf.dir>/etc/linkis-conf</linkis.conf.dir>
<linkis.log.dir>/var/logs/linkis</linkis.log.dir>
<ldh.hadoop.version>2.7.2</ldh.hadoop.version>
<ldh.hive.version>2.3.3</ldh.hive.version>
<ldh.spark.version>2.4.3</ldh.spark.version>
<ldh.spark.hadoop.version>2.7</ldh.spark.hadoop.version>
<ldh.hadoop.version>3.3.4</ldh.hadoop.version>
<ldh.hive.version>3.1.3</ldh.hive.version>
<ldh.spark.version>3.2.1</ldh.spark.version>
<ldh.spark.hadoop.version>3.2</ldh.spark.hadoop.version>
<ldh.flink.version>1.12.2</ldh.flink.version>
<ldh.zookeeper.version>3.5.9</ldh.zookeeper.version>
</properties>
113 changes: 12 additions & 101 deletions linkis-engineconn-plugins/spark/pom.xml
@@ -435,16 +435,21 @@
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-client</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>${hadoop.version}</version>
<scope>provided</scope>
<groupId>${spark.hadoop.groupid}</groupId>
<artifactId>${spark.hadoop-common.artifactId}</artifactId>
<version>${spark.hadoop.version}</version>
<scope>${spark.hadoop.scope}</scope>
</dependency>
<dependency>
<groupId>${spark.hadoop.groupid}</groupId>
<artifactId>${spark.hadoop-hdfs.artifactId}</artifactId>
<version>${spark.hadoop.version}</version>
<scope>${spark.hadoop.scope}</scope>
</dependency>
</dependencies>

@@ -485,98 +490,4 @@
</plugin>
</plugins>
</build>
<profiles>
<!-- spark2-hadoop3 version:spark2.4 use hadoop2.7.2 by default mvn validate -Pspark-2.4-hadoop-3.3 -->
<profile>
<id>spark-2.4-hadoop-3.3</id>
<properties>
<hadoop.version>${hadoop-hdfs-client-shade.version}</hadoop.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.linkis</groupId>
<artifactId>linkis-hadoop-hdfs-client-shade</artifactId>
<version>${project.version}</version>
<exclusions>
<exclusion>
<groupId>commmons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>jetty</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>jetty-util</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-server</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-json</artifactId>
</exclusion>
<exclusion>
<groupId>javax.ws.rs</groupId>
<artifactId>jsr311-api</artifactId>
</exclusion>
<exclusion>
<groupId>net.java.dev.jets3t</groupId>
<artifactId>jets3t</artifactId>
</exclusion>
<exclusion>
<groupId>com.jcraft</groupId>
<artifactId>jsch</artifactId>
</exclusion>
<exclusion>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
</exclusion>
<exclusion>
<groupId>xmlenc</groupId>
<artifactId>xmlenc</artifactId>
</exclusion>
<exclusion>
<groupId>net.java.dev.jets3t</groupId>
<artifactId>jets3t</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
</exclusion>
<exclusion>
<groupId>com.jcraft</groupId>
<artifactId>jsch</artifactId>
</exclusion>
<exclusion>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
</exclusion>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
<exclusion>
<groupId>org.eclipse.jetty</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
</profile>
</profiles>
</project>
@@ -30,7 +30,7 @@ public class EngineLabelResponse implements Serializable {
@ApiModelProperty(value = "label id.")
private Integer labelId;

@ApiModelProperty(value = "engine name. eg: spark-2.4.3")
@ApiModelProperty(value = "engine name. eg: spark-3.2.1")
private String engineName;

@ApiModelProperty(value = "install. eg: yes")
@@ -49,21 +49,21 @@ INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (3, 'combined_userCreator_engineType', '*-Visualis,*-*', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (4, 'combined_userCreator_engineType', '*-nodeexecution,*-*', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (5, 'combined_userCreator_engineType', '*-*,*-*', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (6, 'combined_userCreator_engineType', '*-*,spark-2.4.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (7, 'combined_userCreator_engineType', '*-*,hive-2.3.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (6, 'combined_userCreator_engineType', '*-*,spark-3.2.1', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (7, 'combined_userCreator_engineType', '*-*,hive-3.1.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (8, 'combined_userCreator_engineType', '*-*,python-python2', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (9, 'combined_userCreator_engineType', '*-*,pipeline-1', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (10, 'combined_userCreator_engineType', '*-*,jdbc-4', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (11, 'combined_userCreator_engineType', '*-*,openlookeng-1.5.0', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (12, 'combined_userCreator_engineType', '*-IDE,spark-2.4.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (13, 'combined_userCreator_engineType', '*-IDE,hive-2.3.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (12, 'combined_userCreator_engineType', '*-IDE,spark-3.2.1', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (13, 'combined_userCreator_engineType', '*-IDE,hive-3.1.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (14, 'combined_userCreator_engineType', '*-IDE,python-python2', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (15, 'combined_userCreator_engineType', '*-IDE,pipeline-1', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (16, 'combined_userCreator_engineType', '*-IDE,jdbc-4', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (17, 'combined_userCreator_engineType', '*-IDE,openlookeng-1.5.0', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (18, 'combined_userCreator_engineType', '*-Visualis,spark-2.4.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (19, 'combined_userCreator_engineType', '*-nodeexecution,spark-2.4.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (20, 'combined_userCreator_engineType', '*-nodeexecution,hive-2.3.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (18, 'combined_userCreator_engineType', '*-Visualis,spark-3.2.1', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (19, 'combined_userCreator_engineType', '*-nodeexecution,spark-3.2.1', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (20, 'combined_userCreator_engineType', '*-nodeexecution,hive-3.1.3', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');
INSERT INTO `linkis_cg_manager_label` (`id`, `label_key`, `label_value`, `label_feature`, `label_value_size`, `update_time`, `create_time`) VALUES (21, 'combined_userCreator_engineType', '*-nodeexecution,python-python2', 'OPTIONAL', 2, '2022-11-24 20:46:21', '2022-11-24 20:46:21');


@@ -39,7 +39,7 @@ public enum LinkisConfigurationErrorCodeSummary implements LinkisErrorCode {
ENGINE_TYPE_IS_NULL(14100, "Engine type is null, cannot be added(引擎类型为空,无法添加)"),
INCORRECT_FIXED_SUCH(
14100,
"The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-2.4.3(保存的引擎类型参数有误,请按照固定格式传送,例如spark-2.4.3)"),
"The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-3.2.1(保存的引擎类型参数有误,请按照固定格式传送,例如spark-3.2.1)"),
INCOMPLETE_RECONFIRM(14100, "Incomplete request parameters, please reconfirm(请求参数不完整,请重新确认)"),
ONLY_ADMIN_CAN_MODIFY(14100, "Only admin can modify category(只有管理员才能修改目录)"),
THE_LABEL_PARAMETER_IS_EMPTY(14100, " The label parameter is empty(标签参数为空)"),
@@ -55,9 +55,9 @@ public class ConfigurationRestfulApiTest {
public void TestAddKeyForEngine() throws Exception {
MultiValueMap<String, String> paramsMap = new LinkedMultiValueMap<>();
paramsMap.add("engineType", "spark");
paramsMap.add("version", "2.4.3");
paramsMap.add("version", "3.2.1");
paramsMap.add("token", "e8724-e");
paramsMap.add("keyJson", "{'engineType':'spark','version':'2.4.3'}");
paramsMap.add("keyJson", "{'engineType':'spark','version':'3.2.1'}");
String url = "/configuration/addKeyForEngine";
sendUrl(url, paramsMap, "get", null);
}
@@ -66,7 +66,7 @@ public void TestAddKeyForEngine() throws Exception {
public void TestGetFullTreesByAppName() throws Exception {
MultiValueMap<String, String> paramsMap = new LinkedMultiValueMap<>();
paramsMap.add("engineType", "spark");
paramsMap.add("version", "2.4.3");
paramsMap.add("version", "3.2.1");
paramsMap.add("creator", "sam");
String url = "/configuration/getFullTreesByAppName";

@@ -127,7 +127,7 @@ public void TestSaveFullTree() throws Exception {
// " }\n" +
// " ],\n" +
// " \"creator\": \"LINKISCLI\",\n" +
// " \"engineType\": \"hive-2.3.3\"\n" +
// " \"engineType\": \"hive-3.1.3\"\n" +
// "}";
// String url = "/configuration/saveFullTree";
//
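
For context, `TestAddKeyForEngine` above exercises `/configuration/addKeyForEngine` as a GET with `engineType`, `version`, `token`, and `keyJson` parameters. A hedged standalone sketch of the same call over HTTP; the gateway address, API prefix, and token value are assumptions, not part of this PR:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class AddKeyForEngineCall {
  public static void main(String[] args) throws Exception {
    // Parameters mirror the unit test above; host, API prefix and token are assumed.
    String keyJson = URLEncoder.encode(
        "{'engineType':'spark','version':'3.2.1'}", StandardCharsets.UTF_8);
    URI uri = URI.create(
        "http://localhost:9001/api/rest_j/v1/configuration/addKeyForEngine"
            + "?engineType=spark&version=3.2.1&token=e8724-e&keyJson=" + keyJson);

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(HttpRequest.newBuilder(uri).GET().build(),
            HttpResponse.BodyHandlers.ofString());
    System.out.println(response.statusCode() + " " + response.body());
  }
}
```
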
2 changes: 1 addition & 1 deletion linkis-web/src/apps/linkis/module/setting/setting.vue
@@ -310,7 +310,7 @@ export default {
{
creator: parameter[0], // Specify a first-level directory(指定一级目录)
engineType: parameter[1], // Specify the engine (secondary directory) if there is only a first-level directory, it will be automatically undefined and no parameters will be passed(指定引擎(二级目录)如果只有一级目录则自动为undefined不会传参)
version: parameter[2], // The corresponding engine currently only supports the corresponding version. For example, spark will pass version-2.4.3. If there is only a first-level directory, it will be automatically undefined and no parameters will be passed.(对应的引擎目前只支持对应的版本,如spark就传version-2.4.3,如果只有一级目录则自动为undefined不会传参)
version: parameter[2], // The corresponding engine currently only supports the corresponding version. For example, spark will pass version-3.2.1. If there is only a first-level directory, it will be automatically undefined and no parameters will be passed.(对应的引擎目前只支持对应的版本,如spark就传version-3.2.1,如果只有一级目录则自动为undefined不会传参)
},
"get"
)
47 changes: 25 additions & 22 deletions pom.xml
@@ -105,22 +105,27 @@
<properties>
<revision>1.3.2-SNAPSHOT</revision>
<jedis.version>2.9.2</jedis.version>
<spark.version>2.4.3</spark.version>
<hive.version>2.3.3</hive.version>
<hadoop.version>2.7.2</hadoop.version>
<hadoop-hdfs-client.artifact>hadoop-hdfs</hadoop-hdfs-client.artifact>
<spark.version>3.2.1</spark.version>
<hive.version>3.1.3</hive.version>
<hadoop.version>3.3.4</hadoop.version>
<hadoop-hdfs-client.artifact>hadoop-hdfs-client</hadoop-hdfs-client.artifact>
<hadoop-hdfs-client-shade.version>2.7.2</hadoop-hdfs-client-shade.version>
<spark.hadoop.groupid>org.apache.hadoop</spark.hadoop.groupid>
<spark.hadoop-common.artifactId>hadoop-common</spark.hadoop-common.artifactId>
<spark.hadoop-hdfs.artifactId>hadoop-hdfs</spark.hadoop-hdfs.artifactId>
<spark.hadoop.version>${hadoop.version}</spark.hadoop.version>
<spark.hadoop.scope>provided</spark.hadoop.scope>
<zookeeper.version>3.5.9</zookeeper.version>
<!-- hadoop 2.7 use curator 2.7.1, hadoop3.3 use curator 4.2.0-->
<curator.version>2.7.1</curator.version>
<curator.version>4.2.0</curator.version>
<guava.version>30.0-jre</guava.version>
<netty.version>4.1.86.Final</netty.version>

<!-- json -->
<gson.version>2.8.9</gson.version>
<jackson-bom.version>2.13.4.20221013</jackson-bom.version>
<!-- spark2.4 use 3.5.3, spark3.2 use 3.7.0-M11 -->
<json4s.version>3.5.3</json4s.version>
<json4s.version>3.7.0-M11</json4s.version>

<jersey.version>1.19.4</jersey.version>
<jersey.servlet.version>2.23.1</jersey.servlet.version>
@@ -175,8 +180,8 @@
<!-- dev env -->
<java.version>1.8</java.version>
<maven.version>3.5.0</maven.version>
<scala.version>2.11.12</scala.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.12.17</scala.version>
<scala.binary.version>2.12</scala.binary.version>
<ant.version>1.10.12</ant.version>

<!-- maven plugin versions -->
@@ -1352,30 +1357,28 @@
</build>

<profiles>
<!-- hadoop version: mvn validate -Phadoop-3.3 ,when used with spark2.x ,please add -Pspark-2.4-hadoop-3.3 together, More details please check SPARK-23534 -->
<profile>
<id>hadoop-3.3</id>
<properties>
<hadoop.version>3.3.1</hadoop.version>
<curator.version>4.2.0</curator.version>
<hadoop-hdfs-client.artifact>hadoop-hdfs-client</hadoop-hdfs-client.artifact>
</properties>
</profile>
<!-- hadoop version: mvn validate -Phadoop-2.7 -->
<profile>
<id>hadoop-2.7</id>
<properties>
<hadoop.version>2.7.2</hadoop.version>
<curator.version>2.7.1</curator.version>
<hadoop-hdfs-client.artifact>hadoop-hdfs</hadoop-hdfs-client.artifact>
</properties>
</profile>
<!-- spark2.4 use hadoop2.7.2 by default mvn validate -Pspark-2.4 -->
<profile>
<id>spark-3.2</id>
<id>spark-2.4</id>
<properties>
<json4s.version>3.7.0-M11</json4s.version>
<spark.version>3.2.1</spark.version>
<scala.version>2.12.15</scala.version>
<scala.binary.version>2.12</scala.binary.version>
<spark.hadoop.groupid>org.apache.linkis</spark.hadoop.groupid>
<spark.hadoop-common.artifactId>linkis-hadoop-hdfs-client-shade</spark.hadoop-common.artifactId>
<spark.hadoop-hdfs.artifactId>linkis-hadoop-hdfs-client-shade</spark.hadoop-hdfs.artifactId>
<spark.hadoop.version>${project.version}</spark.hadoop.version>
<spark.hadoop.scope>compile</spark.hadoop.scope>
<json4s.version>3.5.3</json4s.version>
<spark.version>2.4.3</spark.version>
<scala.version>2.11.12</scala.version>
<scala.binary.version>2.11</scala.binary.version>
</properties>
</profile>
<!-- jacoco: mvn validate -Pjacoco -->
121 changes: 120 additions & 1 deletion tool/dependencies/known-dependencies.txt
@@ -585,4 +585,123 @@ seatunnel-core-spark-2.1.2.jar
mongo-java-driver-3.12.7.jar
clickhouse-jdbc-0.3.2-patch11.jar
postgresql-42.3.8.jar

accessors-smart-2.3.1.jar
aircompressor-0.10.jar
akka-actor_2.12-2.5.21.jar
akka-protobuf_2.12-2.5.21.jar
akka-slf4j_2.12-2.5.21.jar
akka-stream_2.12-2.5.21.jar
asm-9.3.jar
avatica-1.11.0.jar
calcite-core-1.16.0.jar
calcite-druid-1.16.0.jar
calcite-linq4j-1.16.0.jar
chill_2.12-0.7.6.jar
commons-configuration2-2.1.1.jar
commons-el-1.0.jar
commons-net-3.6.jar
curator-client-4.2.0.jar
curator-framework-4.2.0.jar
curator-recipes-4.2.0.jar
kerby-util-1.0.1.jar
kerby-xdr-1.0.1.jar
kotlin-stdlib-1.3.72.jar
kotlin-stdlib-common-1.3.72.jar
memory-0.9.0.jar
netty-3.10.6.Final.jar
nimbus-jose-jwt-8.19.jar
orc-core-1.5.8.jar
orc-shims-1.5.8.jar
re2j-1.1.jar
reload4j-1.2.22.jar
scala-compiler-2.12.17.jar
scala-java8-compat_2.12-0.8.0.jar
scala-library-2.12.17.jar
scala-parser-combinators_2.12-1.1.1.jar
scala-reflect-2.12.17.jar
scala-xml_2.12-2.1.0.jar
scalap-2.12.17.jar
scopt_2.12-3.5.0.jar
servlet-api-2.5.jar
sketches-core-0.9.0.jar
slf4j-reload4j-1.7.36.jar
snappy-java-1.1.8.2.jar
ssl-config-core_2.12-0.3.7.jar
token-provider-1.0.1.jar
woodstox-core-5.3.0.jar
dnsjava-2.1.7.jar
esri-geometry-api-2.0.0.jar
flink-clients_2.12-1.12.2.jar
flink-connector-hive_2.12-1.12.2.jar
flink-connector-kafka_2.12-1.12.2.jar
flink-optimizer_2.12-1.12.2.jar
flink-runtime_2.12-1.12.2.jar
flink-scala_2.12-1.12.2.jar
flink-sql-client_2.12-1.12.2.jar
flink-streaming-java_2.12-1.12.2.jar
flink-streaming-scala_2.12-1.12.2.jar
flink-table-api-java-bridge_2.12-1.12.2.jar
flink-table-api-scala-bridge_2.12-1.12.2.jar
flink-table-api-scala_2.12-1.12.2.jar
flink-table-planner-blink_2.12-1.12.2.jar
flink-table-runtime-blink_2.12-1.12.2.jar
flink-yarn_2.12-1.12.2.jar
grizzled-slf4j_2.12-1.3.2.jar
guice-4.0.jar
guice-servlet-4.0.jar
hadoop-annotations-3.3.4.jar
hadoop-auth-3.3.4.jar
hadoop-client-3.3.4.jar
hadoop-common-3.3.4.jar
hadoop-hdfs-2.4.1.jar
hadoop-hdfs-2.7.1.jar
hadoop-hdfs-3.3.4.jar
hadoop-hdfs-client-3.3.4.jar
hadoop-mapreduce-client-common-3.3.4.jar
hadoop-mapreduce-client-core-3.3.4.jar
hadoop-mapreduce-client-jobclient-3.3.4.jar
hadoop-shaded-guava-1.1.1.jar
hadoop-shaded-protobuf_3_7-1.1.1.jar
hadoop-yarn-api-3.3.4.jar
hadoop-yarn-client-3.3.4.jar
hadoop-yarn-common-3.3.4.jar
hadoop-yarn-registry-3.1.0.jar
hive-classification-3.1.3.jar
hive-common-3.1.3.jar
hive-exec-3.1.3.jar
hive-llap-client-3.1.3.jar
hive-llap-common-3.1.3.jar
hive-llap-tez-3.1.3.jar
hive-storage-api-2.7.0.jar
hive-upgrade-acid-3.1.3.jar
hive-vector-code-gen-3.1.3.jar
jackson-jaxrs-base-2.13.4.jar
jackson-jaxrs-json-provider-2.13.4.jar
jackson-module-jaxb-annotations-2.13.4.jar
jackson-module-scala_2.12-2.13.4.jar
jasper-runtime-5.5.23.jar
javax.servlet-api-4.0.1.jar
jcip-annotations-1.0-1.jar
jetty-6.1.26.jar
jetty-rewrite-9.4.48.v20220622.jar
jetty-util-6.1.26.jar
jline-3.9.0.jar
joda-time-2.10.10.jar
joda-time-2.9.9.jar
json-smart-2.3.1.jar
json4s-ast_2.12-3.7.0-M11.jar
json4s-core_2.12-3.7.0-M11.jar
json4s-jackson_2.12-3.7.0-M11.jar
json4s-scalap_2.12-3.7.0-M11.jar
kerb-admin-1.0.1.jar
kerb-client-1.0.1.jar
kerb-common-1.0.1.jar
kerb-core-1.0.1.jar
kerb-crypto-1.0.1.jar
kerb-identity-1.0.1.jar
kerb-server-1.0.1.jar
kerb-simplekdc-1.0.1.jar
kerb-util-1.0.1.jar
kerby-asn1-1.0.1.jar
kerby-config-1.0.1.jar
kerby-pkix-1.0.1.jar