Commit 46b7c59

macro validation
1 parent def8ed2 commit 46b7c59

3 files changed

Lines changed: 141 additions & 31 deletions

cloudsql-mysql-plugin/src/e2e-test/features/source/CloudMySqlRunTimeMacro.feature

Lines changed: 122 additions & 31 deletions
@@ -1,6 +1,5 @@
 Feature: CloudMySql - Verify CloudMySql plugin data transfer with macro arguments

-@ORACLE_SOURCE_TEST @ORACLE_SINK_TEST
 Scenario: To verify data is getting transferred from CloudMySql to CloudMySql successfully using macro arguments in connection section
 Given Open Datafusion Project to configure pipeline
 When Expand Plugin group in the LHS plugins list: "Source"
@@ -27,17 +26,18 @@ Feature: CloudMySql - Verify CloudMySql plugin data transfer with macro arguments
 Then Click on the Macro button of Property: "password" and set the value to: "password"
 Then Enter input plugin property: "referenceName" with value: "RefName"
 Then Replace input plugin property: "database" with value: "databaseName"
-Then Enter input plugin property: "tableName" with value: "mytable"
+Then Enter input plugin property: "CloudMySqlImportQuery" with value: "mytable"
 # Then Validate "CloudSQL MySQL2" plugin properties
 Then Close the Plugin Properties page
 Then Save the pipeline
 Then Preview and run the pipeline
 Then Enter runtime argument value "driver" for key "cloudsql-mysql"
 Then Enter runtime argument value from environment variable "name" for key "username"
 Then Enter runtime argument value from environment variable "pass" for key "password"
+Then Enter runtime argument value "CloudMySqlImportQuery" for key "CloudMySqlImportQuery"
 Then Run the preview of pipeline with runtime arguments
-Then Wait till pipeline preview is in running state
-Then Open and capture pipeline preview logs
+# Then Wait till pipeline preview is in running state
+# Then Open and capture pipeline preview logs
 # Then Verify the preview run status of pipeline in the logs is "succeeded"
 # Then Close the pipeline logs
 # Then Close the preview
@@ -109,42 +109,133 @@ Feature: CloudMySql - Verify CloudMySql plugin data transfer with macro arguments
 When Select plugin: "CloudSQL MySQL" from the plugins list as: "Sink"
 Then Connect plugins: "CloudSQL MySQL" and "CloudSQL MySQL2" to establish connection
 Then Navigate to the properties page of plugin: "CloudSQL MySQL"
-Then Click on the Macro button of Property: "select-jdbcPluginName" and set the value to: "cloudsql-mysql"
+Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "cloudsql-mysql"
 Then Select radio button plugin property: "instanceType" with value: "public"
 Then Enter input plugin property: "connectionName" with value: "cdf-athena:us-central1:sql-automation-test-instance"
 Then Click on the Macro button of Property: "user" and set the value to: "username"
 Then Click on the Macro button of Property: "password" and set the value to: "password"
 Then Enter input plugin property: "referenceName" with value: "RefName"
 Then Replace input plugin property: "database" with value: "databaseName"
+Then Click on the Macro button of Property: "importQuery" and set the value in textarea: "CloudMySqlImportQuery"
+# Then Validate "CloudSQL MySQL" plugin properties
+Then Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "CloudSQL MySQL2"
+Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "cloudsql-mysql"
+Then Select radio button plugin property: "instanceType" with value: "public"
+Then Enter input plugin property: "connectionName" with value: "cdf-athena:us-central1:sql-automation-test-instance"
+Then Click on the Macro button of Property: "user" and set the value to: "username"
+Then Click on the Macro button of Property: "password" and set the value to: "password"
+Then Enter input plugin property: "referenceName" with value: "RefName"
+Then Click on the Macro button of Property: "tableName" and set the value to: "mytable"
+Then Replace input plugin property: "database" with value: "databaseName"
+# Then Validate "CloudSQL MySQL2" plugin properties
+Then Close the Plugin Properties page
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Enter runtime argument value "invalidTable" for key "mytable"
+Then Enter runtime argument value "invalidUserName" for key "username"
+Then Enter runtime argument value "invalidPassword" for key "password"
+Then Enter runtime argument value "invalidImportQuery" for key "CloudMySqlImportQuery"
+Then Run the preview of pipeline with runtime arguments
+Then Verify the preview of pipeline is "Failed"
 
-Then Click on the Macro button of Property: "database" and set the value to: "oracleDatabase"
-Then Select radio button plugin property: "connectionType" with value: "service"
-Then Select radio button plugin property: "role" with value: "sysdba"
-Then Enter input plugin property: "referenceName" with value: "sourceRef"
-Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
-Then Validate "Oracle" plugin properties
+Scenario: To verify pipeline preview fails when invalid basic details provided using macro arguments
+Given Open Datafusion Project to configure pipeline
+When Expand Plugin group in the LHS plugins list: "Source"
+When Select plugin: "CloudSQL MySQL" from the plugins list as: "Source"
+When Expand Plugin group in the LHS plugins list: "Sink"
+When Select plugin: "CloudSQL MySQL" from the plugins list as: "Sink"
+Then Connect plugins: "CloudSQL MySQL" and "CloudSQL MySQL2" to establish connection
+Then Navigate to the properties page of plugin: "CloudSQL MySQL"
+Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "cloudsql-mysql"
+Then Select radio button plugin property: "instanceType" with value: "public"
+Then Enter input plugin property: "connectionName" with value: "cdf-athena:us-central1:sql-automation-test-instance"
+Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+Then Enter input plugin property: "referenceName" with value: "RefName"
+Then Replace input plugin property: "database" with value: "databaseName"
+Then Click on the Macro button of Property: "importQuery" and set the value in textarea: "CloudMySqlImportQuery"
+# Then Validate "CloudSQL MySQL" plugin properties
 Then Close the Plugin Properties page
-Then Navigate to the properties page of plugin: "Oracle2"
-Then Click on the Macro button of Property: "jdbcPluginName" and set the value to: "oracleDriverName"
-Then Click on the Macro button of Property: "host" and set the value to: "oracleHost"
-Then Click on the Macro button of Property: "port" and set the value to: "oraclePort"
-Then Click on the Macro button of Property: "user" and set the value to: "oracleUsername"
-Then Click on the Macro button of Property: "password" and set the value to: "oraclePassword"
-Then Click on the Macro button of Property: "database" and set the value to: "oracleDatabase"
-Then Enter input plugin property: "referenceName" with value: "targetRef"
-Then Replace input plugin property: "tableName" with value: "targetTable"
-Then Replace input plugin property: "dbSchemaName" with value: "schema"
-Then Select radio button plugin property: "connectionType" with value: "service"
-Then Select radio button plugin property: "role" with value: "sysdba"
-Then Validate "Oracle2" plugin properties
+Then Navigate to the properties page of plugin: "CloudSQL MySQL2"
+Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "cloudsql-mysql"
+Then Select radio button plugin property: "instanceType" with value: "public"
+Then Enter input plugin property: "connectionName" with value: "cdf-athena:us-central1:sql-automation-test-instance"
+Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+Then Enter input plugin property: "referenceName" with value: "RefName"
+Then Click on the Macro button of Property: "tableName" and set the value to: "mytable"
+Then Replace input plugin property: "database" with value: "databaseName"
+# Then Validate "CloudSQL MySQL2" plugin properties
+Then Close the Plugin Properties page
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Enter runtime argument value "invalidImportQuery" for key "CloudMySqlImportQuery"
+Then Enter runtime argument value "invalidTable" for key "mytable"
+Then Run the preview of pipeline with runtime arguments
+Then Verify the preview of pipeline is "Failed"
+
+Scenario: To verify data is getting transferred from CloudMySql source to BigQuery sink using macro arguments
+Given Open Datafusion Project to configure pipeline
+When Expand Plugin group in the LHS plugins list: "Source"
+When Select plugin: "CloudSQL MySQL" from the plugins list as: "Source"
+When Expand Plugin group in the LHS plugins list: "Sink"
+When Select plugin: "BigQuery" from the plugins list as: "Sink"
+Then Connect plugins: "CloudSQL MySQL" and "BigQuery" to establish connection
+Then Navigate to the properties page of plugin: "CloudSQL MySQL"
+Then Click on the Macro button of Property: "jdbcPluginName" and set the value to: "CloudMySqlDriverName"
+Then Select radio button plugin property: "instanceType" with value: "public"
+Then Enter input plugin property: "connectionName" with value: "cdf-athena:us-central1:sql-automation-test-instance"
+Then Click on the Macro button of Property: "user" and set the value to: "username"
+Then Click on the Macro button of Property: "password" and set the value to: "password"
+Then Enter input plugin property: "referenceName" with value: "RefName"
+Then Replace input plugin property: "database" with value: "databaseName"
+Then Click on the Macro button of Property: "importQuery" and set the value in textarea: "CloudMySqlImportQuery"
+# Then Validate "CloudSQL MySQL" plugin properties
+Then Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery"
+Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+Then Click on the Macro button of Property: "project" and set the value to: "projectId"
+Then Click on the Macro button of Property: "datasetProject" and set the value to: "bqDatasetId"
+Then Click on the Macro button of Property: "dataset" and set the value to: "dataset"
+Then Click on the Macro button of Property: "table" and set the value to: "bqSourceTable"
+Then Click on the Macro button of Property: "truncateTable" and set the value to: "bqTruncateTable"
+Then Click on the Macro button of Property: "allowSchemaRelaxation" and set the value to: "bqUpdateTableSchema"
+# Then Validate "BigQuery" plugin properties
 Then Close the Plugin Properties page
 Then Save the pipeline
 Then Preview and run the pipeline
-Then Enter runtime argument value "invalidDriverName" for key "oracleDriverName"
-Then Enter runtime argument value "invalidHost" for key "oracleHost"
-Then Enter runtime argument value "invalidPort" for key "oraclePort"
-Then Enter runtime argument value "invalidUserName" for key "oracleUsername"
-Then Enter runtime argument value "invalidPassword" for key "oraclePassword"
-Then Enter runtime argument value "invalidDatabaseName" for key "oracleDatabase"
+Then Enter runtime argument value "CloudMySqlDriverName" for key "CloudMySqlDriverName"
+Then Enter runtime argument value from environment variable "name" for key "username"
+Then Enter runtime argument value from environment variable "pass" for key "password"
+Then Enter runtime argument value "CloudMySqlImportQuery" for key "CloudMySqlImportQuery"
+Then Enter runtime argument value "projectId" for key "projectId"
+Then Enter runtime argument value "projectId" for key "bqDatasetId"
+Then Enter runtime argument value "dataset" for key "dataset"
+Then Enter runtime argument value "bqSourceTable" for key "bqSourceTable"
+Then Enter runtime argument value "bqTargetTable" for key "bqTruncateTable"
+Then Enter runtime argument value "bqTargetTable" for key "bqUpdateTableSchema"
 Then Run the preview of pipeline with runtime arguments
-Then Verify the preview of pipeline is "Failed"
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+# Then Verify the preview run status of pipeline in the logs is "succeeded"
+# Then Close the pipeline logs
+# Then Close the preview
+# Then Deploy the pipeline
+# Then Run the Pipeline in Runtime
+# Then Enter runtime argument value "CloudMySqlDriverName" for key "CloudMySqlDriverName"
+# Then Enter runtime argument value from environment variable "name" for key "username"
+# Then Enter runtime argument value from environment variable "pass" for key "password"
+# Then Enter runtime argument value "CloudMySqlImportQuery" for key "CloudMySqlImportQuery"
+# Then Enter runtime argument value "projectId" for key "projectId"
+# Then Enter runtime argument value "projectId" for key "bqDatasetId"
+# Then Enter runtime argument value "dataset" for key "dataset"
+# Then Enter runtime argument value "bqSourceTable" for key "bqSourceTable"
+# Then Enter runtime argument value "bqTargetTable" for key "bqTruncateTable"
+# Then Enter runtime argument value "bqTargetTable" for key "bqUpdateTableSchema"
+# Then Run the Pipeline in Runtime with runtime arguments
+# Then Wait till pipeline is in running state
+# Then Open and capture logs
+# Then Verify the pipeline status is "Succeeded"
+# Then Close the pipeline logs
+# Then Validate OUT record count is equal to records transferred to target BigQuery table
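The scenarios above set plugin properties to macro placeholders (e.g. `${CloudMySqlImportQuery}`) and then supply values for those keys as runtime arguments in the preview dialog. As a rough illustration of that substitution step, here is a minimal sketch, not CDAP's actual macro evaluator; the class name `MacroResolver` and its method are hypothetical:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: resolves ${key} placeholders in a plugin property
// against the runtime-argument map entered at preview/run time.
public class MacroResolver {
    private static final Pattern MACRO = Pattern.compile("\\$\\{([^}]+)\\}");

    public static String resolve(String property, Map<String, String> runtimeArgs) {
        Matcher m = MACRO.matcher(property);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String value = runtimeArgs.get(m.group(1));
            if (value == null) {
                // Mirrors the failure mode the "Failed" preview scenarios exercise:
                // a macro with no matching runtime argument cannot be resolved.
                throw new IllegalArgumentException("No runtime argument for macro: " + m.group(1));
            }
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> runtimeArgs = Map.of(
            "username", "root",
            "CloudMySqlImportQuery", "select * from mytable");
        System.out.println(resolve("${CloudMySqlImportQuery}", runtimeArgs));
    }
}
```

The negative scenarios rely on exactly this mechanism: the pipeline validates only at runtime, so invalid values such as `invalidImportQuery` pass configuration but fail the preview run.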
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+package io.cdap.plugin.CloudMySql;
+
+import io.cdap.e2e.utils.PluginPropertyUtils;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.util.TimeZone;
+
+public class CloudMySqlClient {
+}
+
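The committed `CloudMySqlClient` is still an empty shell with unused imports. One plausible direction for it, sketched here as an assumption rather than this commit's code (the method names are invented, and the Cloud SQL socket-factory connector is assumed to be on the classpath at runtime):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Sketch only, not part of this commit: a JDBC helper for Cloud SQL MySQL.
public class CloudMySqlClient {

    // Builds a Cloud SQL MySQL JDBC URL that routes through the Cloud SQL
    // Java connector's socket factory instead of a host/port pair.
    static String buildJdbcUrl(String database, String instanceConnectionName) {
        return "jdbc:mysql:///" + database
            + "?cloudSqlInstance=" + instanceConnectionName
            + "&socketFactory=com.google.cloud.sql.mysql.SocketFactory";
    }

    static Connection getConnection(String database, String instanceConnectionName,
                                    String user, String password) throws SQLException {
        return DriverManager.getConnection(
            buildJdbcUrl(database, instanceConnectionName), user, password);
    }
}
```

The `instanceConnectionName` takes the `project:region:instance` form used in the feature file's `connectionName` property.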

cloudsql-mysql-plugin/src/e2e-test/resources/pluginParameters.properties

Lines changed: 7 additions & 0 deletions
@@ -19,6 +19,13 @@ driver=cloudsql-mysql
 table=myTable
 name=NAME
 pass=PASS
+invalidUserName=testUser
+invalidPassword=testPassword
+invalidTable=data
+CloudMySqlDriverName=cloudsql-mysql
+bqTruncateTable=truncateTable
+bqUpdateTableSchema=updateSchema
+invalidDatabaseName=invalidDB%$^%*
 outputDatatypesSchema2=[{"key":"ID","value":"string"},{"key":"COL1","value":"string"},{"key":"COL2","value":"bytes"},\
 {"key":"COL3","value":"bytes"},{"key":"COL4","value":"string"},{"key":"COL5","value":"string"},\
 {"key":"COL6","value":"bytes"}]
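Note the trailing `\` in `outputDatatypesSchema2`: in Java `.properties` files it continues a value onto the next line, with the continued line's leading whitespace dropped. A small stdlib-only sketch of that behavior (the `PluginParams` class and its fragment are illustrative, not from this commit):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Illustrative sketch: how key=value pairs like those in
// pluginParameters.properties are parsed, including '\' line continuation.
public class PluginParams {
    static Properties parse(String text) throws IOException {
        Properties p = new Properties();
        p.load(new StringReader(text));
        return p;
    }

    public static void main(String[] args) throws IOException {
        String fragment = "invalidTable=data\n"
            + "CloudMySqlDriverName=cloudsql-mysql\n"
            // A '\' at end of line joins the next line into the same value;
            // leading spaces of the continued line are discarded on load.
            + "schema=[{\"key\":\"ID\"},\\\n  {\"key\":\"COL1\"}]\n";
        Properties p = parse(fragment);
        System.out.println(p.getProperty("schema"));
    }
}
```

This is why the multi-line `outputDatatypesSchema2` entry loads as a single JSON-like string.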

0 commit comments
