debug #1

Open · wants to merge 10 commits into base: develop
6 changes: 3 additions & 3 deletions .github/workflows/build.yml
```diff
@@ -10,8 +10,8 @@
 # the License.

 # This workflow will build a Java project with Maven
-# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
-# Note: Any changes to this workflow would be used only after merging into develop
+# Fors more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
+# Note: Any changes to this workflow would be used only after merging into developss
 name: Build with unit tests

 on:
@@ -23,7 +23,7 @@ on:

 jobs:
   build:
-    runs-on: k8s-runner-build
+    runs-on: self-hosted

     if: ${{ github.event.workflow_run.conclusion != 'skipped' }}
```
51 changes: 26 additions & 25 deletions .github/workflows/e2e.yml
```diff
@@ -9,7 +9,7 @@
 # License for the specific language governing permissions and limitations under
 # the License.

-# This workflow will build a Java project with Maven
+# This workflow will build a Java project with Mavens
 # For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
 # Note: Any changes to this workflow would be used only after merging into develop
 name: Build e2e tests
@@ -24,7 +24,7 @@ on:

 jobs:
   build:
-    runs-on: k8s-runner-e2e
+    runs-on: self-hosted
     # We allow builds:
     # 1) When triggered manually
     # 2) When it's a merge into a branch
@@ -63,8 +63,9 @@ jobs:
       - name: Checkout e2e test repo
         uses: actions/checkout@v3
         with:
-          repository: cdapio/cdap-e2e-tests
+          repository: cloudsufi/cdap-e2e-tests
           path: e2e
+          ref: oracle_debug

       - name: Cache
         uses: actions/cache@v3
@@ -79,28 +80,28 @@
         uses: 'google-github-actions/get-secretmanager-secrets@v0'
         with:
           secrets: |-
-            ORACLE_HOST:cdapio-github-builds/ORACLE_HOST
-            ORACLE_USERNAME:cdapio-github-builds/ORACLE_USERNAME
-            ORACLE_PASSWORD:cdapio-github-builds/ORACLE_PASSWORD
-            ORACLE_PORT:cdapio-github-builds/ORACLE_PORT
-            MSSQL_HOST:cdapio-github-builds/MSSQL_HOST
-            MSSQL_USERNAME:cdapio-github-builds/MSSQL_USERNAME
-            MSSQL_PASSWORD:cdapio-github-builds/MSSQL_PASSWORD
-            MSSQL_PORT:cdapio-github-builds/MSSQL_PORT
-            MYSQL_HOST:cdapio-github-builds/MYSQL_HOST
-            MYSQL_USERNAME:cdapio-github-builds/MYSQL_USERNAME
-            MYSQL_PASSWORD:cdapio-github-builds/MYSQL_PASSWORD
-            MYSQL_PORT:cdapio-github-builds/MYSQL_PORT
-            POSTGRESQL_HOST:cdapio-github-builds/POSTGRESQL_HOST
-            POSTGRESQL_USERNAME:cdapio-github-builds/POSTGRESQL_USERNAME
-            POSTGRESQL_PASSWORD:cdapio-github-builds/POSTGRESQL_PASSWORD
-            POSTGRESQL_PORT:cdapio-github-builds/POSTGRESQL_PORT
-            CLOUDSQL_POSTGRESQL_USERNAME:cdapio-github-builds/CLOUDSQL_POSTGRESQL_USERNAME
-            CLOUDSQL_POSTGRESQL_PASSWORD:cdapio-github-builds/CLOUDSQL_POSTGRESQL_PASSWORD
-            CLOUDSQL_POSTGRESQL_CONNECTION_NAME:cdapio-github-builds/CLOUDSQL_POSTGRESQL_CONNECTION_NAME
-            CLOUDSQL_MYSQL_USERNAME:cdapio-github-builds/CLOUDSQL_MYSQL_USERNAME
-            CLOUDSQL_MYSQL_PASSWORD:cdapio-github-builds/CLOUDSQL_MYSQL_PASSWORD
-            CLOUDSQL_MYSQL_CONNECTION_NAME:cdapio-github-builds/CLOUDSQL_MYSQL_CONNECTION_NAME
+            ORACLE_HOST:cdf-entcon/ORACLE_HOST
+            ORACLE_USERNAME:cdf-entcon/ORACLE_USERNAME
+            ORACLE_PASSWORD:cdf-entcon/ORACLE_PASSWORD
+            ORACLE_PORT:cdf-entcon/ORACLE_PORT
+            MSSQL_HOST:cdf-entcon/MSSQL_HOST
+            MSSQL_USERNAME:cdf-entcon/MSSQL_USERNAME
+            MSSQL_PASSWORD:cdf-entcon/MSSQL_PASSWORD
+            MSSQL_PORT:cdf-entcon/MSSQL_PORT
+            MYSQL_HOST:cdf-entcon/MYSQL_HOST
+            MYSQL_USERNAME:cdf-entcon/MYSQL_USERNAME
+            MYSQL_PASSWORD:cdf-entcon/MYSQL_PASSWORD
+            MYSQL_PORT:cdf-entcon/MYSQL_PORT
+            POSTGRESQL_HOST:cdf-entcon/POSTGRESQL_HOST
+            POSTGRESQL_USERNAME:cdf-entcon/POSTGRESQL_USERNAME
+            POSTGRESQL_PASSWORD:cdf-entcon/POSTGRESQL_PASSWORD
+            POSTGRESQL_PORT:cdf-entcon/POSTGRESQL_PORT
+            CLOUDSQL_POSTGRESQL_USERNAME:cdf-entcon/CLOUDSQL_POSTGRESQL_USERNAME
+            CLOUDSQL_POSTGRESQL_PASSWORD:cdf-entcon/CLOUDSQL_POSTGRESQL_PASSWORD
+            CLOUDSQL_POSTGRESQL_CONNECTION_NAME:cdf-entcon/CLOUDSQL_POSTGRESQL_CONNECTION_NAME
+            CLOUDSQL_MYSQL_USERNAME:cdf-entcon/CLOUDSQL_MYSQL_USERNAME
+            CLOUDSQL_MYSQL_PASSWORD:cdf-entcon/CLOUDSQL_MYSQL_PASSWORD
+            CLOUDSQL_MYSQL_CONNECTION_NAME:cdf-entcon/CLOUDSQL_MYSQL_CONNECTION_NAME

       - name: Run required e2e tests
         if: github.event_name != 'workflow_dispatch' && github.event_name != 'push' && steps.filter.outputs.e2e-test == 'false'
```
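The `get-secretmanager-secrets` step above declares one mapping per line in the form `OUTPUT_NAME:gcp-project/secret-name`. As an illustrative sketch only (this is not the action's actual implementation), the mapping format can be parsed like this:

```python
def parse_secret_mappings(block: str) -> dict:
    """Map each output name to its (project, secret) pair."""
    mappings = {}
    for line in block.strip().splitlines():
        # Split on the first ":" (output name) and first "/" (project/secret).
        name, source = line.strip().split(":", 1)
        project, secret = source.split("/", 1)
        mappings[name] = (project, secret)
    return mappings

block = """
ORACLE_HOST:cdf-entcon/ORACLE_HOST
ORACLE_USERNAME:cdf-entcon/ORACLE_USERNAME
"""
print(parse_secret_mappings(block)["ORACLE_HOST"])  # ('cdf-entcon', 'ORACLE_HOST')
```

This shows why the PR only needed to swap the project prefix (`cdapio-github-builds` to `cdf-entcon`): the output names consumed by later steps are unchanged.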
2 changes: 1 addition & 1 deletion .github/workflows/trigger.yml
```diff
@@ -26,7 +26,7 @@ on:

 jobs:
   trigger:
-    runs-on: ubuntu-latest
+    runs-on: self-hosted

     # We allow builds:
     # 1) When triggered manually
```
```diff
@@ -17,7 +17,7 @@ fetchSize=1000
 NumSplits=1
 SplitBy=ID
 jdbcURL=jdbc:mysql:///%s?cloudSqlInstance=%s&socketFactory=com.google.cloud.sql.mysql.SocketFactory&user=%s&password=%s
-projectId=cdf-athena
+projectId=cdf-entcon
 datasetprojectId=cdf-athena
 BQReferenceName=reference
 targetTable=mytable5
```
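The `jdbcURL` property above is a template whose `%s` placeholders are filled at run time. A minimal sketch of the substitution, using hypothetical values (the real database, instance, and credentials come from the test environment):

```python
# Cloud SQL MySQL JDBC URL template from the properties file; the %s slots are
# filled in order: database name, instance connection name, user, password.
template = ("jdbc:mysql:///%s?cloudSqlInstance=%s"
            "&socketFactory=com.google.cloud.sql.mysql.SocketFactory"
            "&user=%s&password=%s")

# Hypothetical example values, not taken from the PR.
url = template % ("mydb", "my-project:us-central1:my-instance", "dbuser", "dbpass")
print(url)
```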
```diff
@@ -79,7 +79,7 @@ splitByColumn=ID
 importQuery=where $CONDITIONS

 #bq properties
-projectId=cdf-athena
+projectId=cdf-entcon
 dataset=test_automation
 bqOutputMultipleDatatypesSchema=[{"key":"col1","value":"bytes"},{"key":"col2","value":"string"},\
 {"key":"col3","value":"date"},{"key":"col4","value":"double"},{"key":"col5","value":"decimal"},\
```
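The `bqOutputMultipleDatatypesSchema` value above is a JSON array of `{"key": column, "value": type}` entries, split across lines with trailing backslashes (Java-properties line continuations). The diff truncates the full value, so this sketch only closes off the visible fragment to show how such a value reads back:

```python
import json

# Visible fragment of the schema value, with the array closed here for the
# example (assumption: the real properties value continues past the diff).
raw = ('[{"key":"col1","value":"bytes"},{"key":"col2","value":"string"},'
       '{"key":"col3","value":"date"},{"key":"col4","value":"double"},'
       '{"key":"col5","value":"decimal"}]')

schema = {entry["key"]: entry["value"] for entry in json.loads(raw)}
print(schema["col5"])  # decimal
```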
```diff
@@ -22,8 +22,8 @@ zeroValue=0
 blankvalue=
 invalidTableName=Table123@
 bqtarget.table=target-table
-projectId=cdf-athena
-datasetprojectId=cdf-athena
+projectId=cdf-entcon
+datasetprojectId=cdf-entcon
 dataset=SQL_SERVER_TEST
 invalid.boundQuery=SELECT MIN(id),MAX(id) FROM table
 splitby=id
```
```diff
@@ -14,12 +14,12 @@ invalid.tablename=123#
 invalid.referenceName=abc#
 fetchsize=1000
 no.of.splits=1
-projectId=cdf-athena
+projectId=cdf-entcon
 #invalid properties
 invalid.host=localhost1
 bqTargetTable=target-table
 bqSourceTable=dummy
-datasetprojectId=cdf-athena
+datasetprojectId=cdf-entcon
 dataset=test_automation
 table.name=mysqltable
 importQuery=select * from testTable where $CONDITIONS;
```
```diff
@@ -17,7 +17,7 @@
 @Oracle
 Feature: Oracle - Verify Oracle source data transfer for multiple datatypes
   @ORACLE_SOURCE_DATATYPES_TEST @ORACLE_TARGET_DATATYPES_TEST @Oracle_Required
-  # Oracle Sanity test to transfer table data containing multiple datatypes
+  # Oracle Sanitys test to transfer table data containing multiple datatypes
   Scenario: To verify data is getting transferred from Oracle to Oracle successfully
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
```
```diff
@@ -17,7 +17,7 @@
 @Oracle
 Feature: Oracle - Verify Oracle source data transfer of type LONG
   @ORACLE_SOURCE_DATATYPES_TEST2 @ORACLE_TARGET_DATATYPES_TEST2 @Oracle_Required
-  # Oracle Sanity test to transfer table data containing LONG
+  # Oracles Sanity test to transfer table data containing LONG
   Scenario: To verify data is getting transferred from Oracle to Oracle successfully
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
```
```diff
@@ -17,7 +17,7 @@
 @Oracle
 Feature: Oracle - Verify Oracle source data transfer of type LONG VARCHAR
   @ORACLE_SOURCE_DATATYPES_TEST4 @ORACLE_TARGET_DATATYPES_TEST4 @Oracle_Required
-  # Oracle Sanity test to transfer table data containing LONG VARCHAR
+  # Oracles Sanitysss test to transfer table data containing LONG VARCHAR
   Scenario: To verify data is getting transferred from Oracle to Oracle successfully
     Given Open Datafusion Project to configure pipeline
     When Expand Plugin group in the LHS plugins list: "Source"
```
2 changes: 1 addition & 1 deletion oracle-plugin/src/e2e-test/features/oracle/Oracle.feature
```diff
@@ -7,7 +7,7 @@
 #
 # http://www.apache.org/licenses/LICENSE-2.0
 #
-# Unless required by applicable law or agreed to in writing, software
+# Unless required by applicable law or agreed to in writing, softwares
 # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations under
```
```diff
@@ -15,7 +15,7 @@
 #

 @Oracle @Oracle_Required
-Feature: Oracle source- Verify Oracle source plugin design time scenarios
+Feature: Oracle source- Verify Oracle source plugin design time scenarioss

 @ORACLE_SOURCE_TEST
 Scenario: To verify Oracle source plugin validation with connection and basic details for connectivity
```
```diff
@@ -1,8 +1,8 @@
 driverName=oracle
-databaseName=ORCLPDB1
+databaseName=ORCLCDB
 sourceRef=source
 targetRef=target
-schema=CDAP
+schema=c##builduser
 host=ORACLE_HOST
 port=ORACLE_PORT
 username=ORACLE_USERNAME
@@ -95,7 +95,7 @@ splitByColumn=ID
 importQuery=where $CONDITIONS

 #bq properties
-projectId=cdf-athena
+projectId=cdf-entcon
 dataset=test_automation
 bqOutputDatatypesSchema=[{"key":"ID","value":"decimal"},{"key":"LASTNAME","value":"string"}]
 jdbcUrl=jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=%s;OAuthType=3;
```
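Like the MySQL URL earlier, the BigQuery `jdbcUrl` property above is a template; its single `%s` placeholder takes the GCP project id. A minimal sketch of the substitution:

```python
# BigQuery JDBC URL template from the properties file; the one %s slot is the
# GCP project id (here filled with the project id used in this PR).
bq_template = ("jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
               "ProjectId=%s;OAuthType=3;")
print(bq_template % "cdf-entcon")
```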
```diff
@@ -79,7 +79,7 @@ zeroValue=0
 splitByColumn=ID
 importQuery=where $CONDITIONS
 #bq properties
-projectId=cdf-athena
+projectId=cdf-entcon
 dataset=test_automation
 bqOutputMultipleDatatypesSchema=[{"key":"col1","value":"bytes"},{"key":"col2","value":"string"},\
 {"key":"col3","value":"date"},{"key":"col4","value":"double"},{"key":"col5","value":"decimal"},\
```