
[Feature][e2e] Resource Center test cases #7312

@janeHe13


Search before asking

  • I had searched in the issues and found no similar feature requirement.

Description

Each case below lists its number, function module, test point, priority, and status, followed by the precondition (the services to start), test steps, and expected results. The actual-results and remarks columns were empty.

- **001** File management / Create folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Create Folder" button to open the create-folder page. 2) Enter a folder name and description. 3) Click the "Create" button.
  - Expected: the folder is created successfully.
- **002** File management / Create multi-level folders (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click a folder and create a folder or file under it.
  - Expected: 1) the folder or file is created under the parent folder; 2) folders cannot be edited or downloaded.
- **003** File management / Cancel folder creation (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Click "Create Folder" and enter the folder name and description. 2) Click the "Cancel" button.
  - Expected: folder creation is cancelled.
- **004** File management / Rename folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Click the folder's "Rename" button to open the rename dialog. 2) Enter a name and description. 3) Click the "Rename" button.
  - Expected: 1) the folder is renamed successfully; 2) if a file under the folder is referenced by a workflow task, the task shows the renamed folder name; 3) the folder name of authorized resources in the user center is the renamed name.
- **005** File management / Cancel folder renaming (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Click the folder's "Rename" button to open the rename dialog. 2) Enter a name and description. 3) Click the "Cancel" button.
  - Expected: renaming is cancelled and the folder name is unchanged.
- **006** File management / Delete folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the folder's "Delete" button, then click "OK".
  - Expected: 1) if any file under the folder is referenced by an online workflow, the folder cannot be deleted; 2) if no file under the folder is referenced by a task, or every referencing workflow is offline, the folder is deleted successfully.
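Cases 006 and 021 both describe a reference guard: a folder (or file) may only be deleted when no online workflow still references its files. A minimal sketch of that guard, with hypothetical names (the real check lives in the DolphinScheduler API service, not in this form):

```python
# Sketch of the delete-folder guard described in case 006.
# All names are hypothetical; this only models the rule for the e2e oracle.

def can_delete_folder(folder_files, online_workflow_refs):
    """A folder may be deleted only if none of its files is referenced
    by a workflow that is currently online.

    folder_files: set of file names under the folder
    online_workflow_refs: set of file names referenced by online workflows
    """
    return folder_files.isdisjoint(online_workflow_refs)

# Referenced by an online workflow -> deletion must be refused
assert can_delete_folder({"etl.sh"}, {"etl.sh"}) is False
# No online reference (or the workflow went offline) -> deletion allowed
assert can_delete_folder({"etl.sh"}, {"other.sh"}) is True
```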
- **007** File management / Cancel deleting a folder (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the folder's "Delete" button, then click "Cancel".
  - Expected: the folder is not deleted.
- **008** File management / Create file (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Create File" button to open the create-file page. 2) Edit the file name, description (optional), and file content, and select the file format. 3) Click the "Create" button.
  - Expected: 1) the file is created and a new row with type = 0 is added to the t_ds_resources table; 2) a new file appears under the resource directory on HDFS or S3; 3) if the creating user is admin with no associated tenant, the file cannot be created; after admin is associated with a tenant, the file can be created.
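Case 008's first expected result can be checked directly against the metadata database. The sketch below uses an in-memory SQLite table as a stand-in for the real t_ds_resources table (which has more columns); only the alias and type columns assumed here matter for the assertion:

```python
# Hypothetical verification step for case 008: after "Create File" succeeds,
# t_ds_resources should contain one new row with type = 0.
# sqlite3 stands in for the real metadata database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_ds_resources (id INTEGER PRIMARY KEY, alias TEXT, type INTEGER)")

def count_resources(conn, alias, rtype):
    # type = 0 marks ordinary files, type = 1 marks UDF resources
    row = conn.execute(
        "SELECT COUNT(*) FROM t_ds_resources WHERE alias = ? AND type = ?",
        (alias, rtype),
    ).fetchone()
    return row[0]

before = count_resources(conn, "demo.sh", 0)
# Simulate the row the create-file action is expected to insert
conn.execute("INSERT INTO t_ds_resources (alias, type) VALUES ('demo.sh', 0)")
after = count_resources(conn, "demo.sh", 0)
assert after == before + 1
```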
- **009** File management / Create a file inside a folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page, select a folder, and click the "Create File" button to open the create-file page. 2) Edit the file name, description (optional), and file content, and select the file format. 3) Click the "Create" button.
  - Expected: 1) the file is created and a new row with type = 0 is added to the t_ds_resources table; 2) the file appears under the selected folder (resource/folder/file) on HDFS or S3; 3) if the creating user is admin with no associated tenant, the file cannot be created; after admin is associated with a tenant, the file can be created.
- **010** File management / Cancel file creation (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Create File" button to open the create-file page. 2) Edit the file name, description (optional), and file content, and select the file format. 3) Click the "Cancel" button.
  - Expected: file creation is cancelled.
- **011** File management / Ordinary user uploads a file smaller than 1 GB (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) As an ordinary user, enter the file management page and click the "Upload File" button to open the upload dialog. 2) Upload a file smaller than 1 GB. 3) Click the "Submit" button.
  - Expected: 1) the file is uploaded and a new row with type = 0 is added to the t_ds_resources table; 2) a new file appears under the resource directory on HDFS or S3.
- **012** File management / Ordinary user uploads a file larger than 1 GB (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) As an ordinary user, enter the file management page and click the "Upload File" button to open the upload dialog. 2) Upload a file larger than 1 GB. 3) Click the "Submit" button.
  - Expected: the upload fails and a prompt is shown.
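Cases 011-013 all hinge on the 1 GB upload limit. A sketch of the size rule as a pure predicate; whether a file of exactly 1 GB is accepted is an assumption to confirm against the actual UI behaviour:

```python
# Upload size rule from cases 011-013: files of 1 GB or more are rejected.
# The strict "<" at the boundary is an assumption, not confirmed by the source.
MAX_UPLOAD_BYTES = 1 * 1024 * 1024 * 1024  # 1 GB

def upload_allowed(size_bytes):
    """Return True if a file of this size should be accepted by the upload dialog."""
    return size_bytes < MAX_UPLOAD_BYTES

assert upload_allowed(512 * 1024 * 1024)          # 512 MB: accepted
assert not upload_allowed(2 * MAX_UPLOAD_BYTES)   # 2 GB: rejected with a prompt
```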
- **013** File management / Admin user uploads a file smaller than 1 GB (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) As the administrator (admin), enter the file management page and click the "Upload File" button to open the upload dialog. 2) Upload a file smaller than 1 GB. 3) Click the "Submit" button.
  - Expected: 1) if admin is associated with a tenant, the upload succeeds and a new row with type = 0 is added to the t_ds_resources table; 2) a new file appears under the resource directory on HDFS or S3; 3) if admin is not associated with a tenant, the upload fails and a prompt pops up.
- **014** File management / Upload a file into a folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: as an ordinary user, enter the file management page, select a folder, and click "Upload File".
  - Expected: the file is uploaded under that folder.
- **015** File management / Cancel uploading a file (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Upload File" button to open the upload dialog. 2) Choose a file smaller than 1 GB, then click the "Cancel" button.
  - Expected: the upload is cancelled.
- **016** File management / Edit file (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Edit" button or the file-name link to open the editing page. 2) Edit the file content. 3) Click the "Save" button.
  - Expected: the edited file content is saved.
- **017** File management / Cancel file editing (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Edit" button to open the editing page. 2) Edit the file content. 3) Click the "Back" button.
  - Expected: the edited file content is not saved.
- **018** File management / Rename file (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Rename" button. 2) In the rename dialog, enter a name (required) and description (optional). 3) Click the "Rename" button.
  - Expected: the file is renamed successfully.
- **019** File management / Cancel renaming (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the file management page and click the "Rename" button. 2) In the rename dialog, enter a name (required) and description (optional). 3) Click the "Cancel" button.
  - Expected: renaming is cancelled.
- **020** File management / Download file (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the file management page and click the "Download" button.
  - Expected: the file is downloaded.
- **021** File management / Delete file (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the file management page, click the "Delete" button, then click "OK".
  - Expected: 1) if the file is associated with an online task, it cannot be deleted; if not, its row is deleted from the t_ds_resources table; 2) the file is deleted from the resource directory on HDFS or S3.
- **022** File management / Cancel deleting a file (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the file management page, click the "Delete" button, then click "Cancel".
  - Expected: the file is not deleted.
- **023** File management / Query file (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter a file name. 2) Click the "Query" button.
  - Expected: the file name supports fuzzy query: 1) if nothing matches, the file list shows "no data"; 2) if there are matches, the file list displays the file data correctly.
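Case 023 (and case 050 for UDF functions) expects fuzzy matching on the name. Backends commonly implement this as a SQL `LIKE '%term%'` filter; the SQLite snippet below is a stand-in for the project's actual query and illustrates the two expected outcomes, an empty result and a correct match list:

```python
# Fuzzy-query oracle for cases 023/050, using sqlite3 as a stand-in backend.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT)")
conn.executemany(
    "INSERT INTO files VALUES (?)",
    [("report.sh",), ("daily_report.py",), ("misc.txt",)],
)

def fuzzy_query(conn, term):
    """Substring match on the file name, as a LIKE '%term%' filter."""
    rows = conn.execute(
        "SELECT name FROM files WHERE name LIKE ? ORDER BY name",
        (f"%{term}%",),
    ).fetchall()
    return [r[0] for r in rows]

assert fuzzy_query(conn, "report") == ["daily_report.py", "report.sh"]
assert fuzzy_query(conn, "nope") == []  # empty result -> list shows "no data"
```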
- **024** File management / Verify the paging control with no more than one page of data (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Select 10 items/page with no more than 10 items and view the pagination. 2) Select 30 items/page with no more than 30 items. 3) Select 50 items/page with no more than 50 items.
  - Expected: in each case all data fits on a single page.
- **025** File management / Verify the paging control with more than one page of data (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Select 10 items/page with more than 10 items and view the pagination. 2) Select 30 items/page with more than 30 items. 3) Select 50 items/page with more than 50 items.
  - Expected: in each case the data is split across multiple pages.
- **026** File management / Verify the ">" button of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the ">" button and view the pagination.
  - Expected: the list jumps to the next page.
- **027** File management / Verify the "<" button of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the "<" button and view the pagination.
  - Expected: the list jumps to the previous page.
- **028** File management / Verify the page-number input box of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter a page number and view the pagination.
  - Expected: the list jumps to the entered page.
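Cases 024-028 (and their UDF counterparts below) all reduce to one oracle: the expected page count is ceil(total / page size). A small helper the pagination assertions could use; showing one page for an empty list is an assumption:

```python
# Page-count oracle for the paging-control cases (024-028, 040-044, 051-055).
import math

def expected_pages(total_items, page_size):
    """Number of pages the paging control should display.

    An empty list is assumed to still render one (empty) page.
    """
    return max(1, math.ceil(total_items / page_size))

assert expected_pages(10, 10) == 1  # case 024: data within page size fits on one page
assert expected_pages(11, 10) == 2  # case 025: more than the page size paginates
assert expected_pages(31, 30) == 2
```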
- **029** UDF management / Resource management / Upload UDF resources (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page, click the "Upload UDF Resources" button, and choose a file smaller than 1 GB. 2) Click the "Upload" button, then click "Confirm" in the dialog.
  - Expected: 1) the file is uploaded and a new row with type = 1 is added to the t_ds_resources table; 2) a new file appears under the resource directory on HDFS or S3.
- **030** UDF management / Resource management / Upload a UDF resource larger than 1 GB (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page, click the "Upload UDF Resources" button, and choose a file larger than 1 GB. 2) Click the "Upload" button, then click "Confirm" in the dialog.
  - Expected: the upload fails and a prompt pops up.
- **031** UDF management / Resource management / Cancel uploading UDF files (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page, click the "Upload UDF Resources" button, and choose a file smaller than 1 GB. 2) Click the "Upload" button, then click "Cancel" in the dialog.
  - Expected: the upload is cancelled.
- **032** UDF management / Resource management / Create folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page and click the "Create Folder" button. 2) Enter a folder name and description (optional). 3) Click the "Create" button.
  - Expected: the folder is created successfully.
- **033** UDF management / Resource management / Cancel folder creation (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page and click the "Create Folder" button. 2) Enter a folder name and description (optional). 3) Click the "Cancel" button.
  - Expected: no folder is created.
- **034** UDF management / Resource management / Upload UDF resources into a folder (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the UDF management - resource management page, select a folder, and upload a UDF resource.
  - Expected: the UDF resource is uploaded successfully under the folder.
- **035** UDF management / Resource management / Rename (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page and click the "Rename" button. 2) Edit the name (required) and description (optional). 3) Click the "Rename" button.
  - Expected: 1) t_ds_resources.file_name is updated to the new name; 2) the file in the resource directory on HDFS or S3 is renamed to the new name.
- **036** UDF management / Resource management / Cancel renaming (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF management - resource management page and click the "Rename" button. 2) Edit the name (required) and description (optional). 3) Click the "Cancel" button.
  - Expected: the UDF resource name is not changed.
- **037** UDF management / Resource management / Download (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the UDF management - resource management page and click the "Download" button.
  - Expected: the UDF resource is downloaded successfully.
- **038** UDF management / Resource management / Delete (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the UDF management - resource management page and click the "Delete" button.
  - Expected: before deletion, check whether any UDF function uses the resource; if one does, the resource cannot be deleted; otherwise it is deleted.
- **039** UDF management / Resource management / Query (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the UDF management - resource management page, enter a UDF resource name, and click the "Query" button.
- **040** UDF management / Resource management / Verify the paging control with no more than one page of data (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Select 10 items/page with no more than 10 items and view the pagination. 2) Select 30 items/page with no more than 30 items. 3) Select 50 items/page with no more than 50 items.
  - Expected: in each case all data fits on a single page.
- **041** UDF management / Resource management / Verify the paging control with more than one page of data (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Select 10 items/page with more than 10 items and view the pagination. 2) Select 30 items/page with more than 30 items. 3) Select 50 items/page with more than 50 items.
  - Expected: in each case the data is split across multiple pages.
- **042** UDF management / Resource management / Verify the ">" button of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the ">" button and view the pagination.
  - Expected: the list jumps to the next page.
- **043** UDF management / Resource management / Verify the "<" button of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the "<" button and view the pagination.
  - Expected: the list jumps to the previous page.
- **044** UDF management / Resource management / Verify the page-number input box of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter a page number and view the pagination.
  - Expected: the list jumps to the entered page.
- **045** UDF management / Function management / Create UDF function (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF function page and click "Create UDF Function" to open the dialog. 2) Edit the UDF function information. 3) Select UDF resources or upload a resource. 4) Click the "Submit" button.
  - Expected: a new row is added to the t_ds_udfs table; verify the UDF function in a Hive SQL task.
- **046** UDF management / Function management / Cancel creating a UDF function (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF function page and click "Create UDF Function" to open the dialog. 2) Edit the UDF function information. 3) Select UDF resources or upload a resource. 4) Click the "Cancel" button.
  - Expected: creation of the UDF function is cancelled.
- **047** UDF management / Function management / Edit UDF function (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF function page and click the "Edit" button to open the dialog. 2) Edit the UDF function information. 3) Select UDF resources or upload a resource. 4) Click the "Edit" button.
  - Expected: the row in the t_ds_udfs table is updated.
- **048** UDF management / Function management / Cancel editing a UDF function (P3, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Enter the UDF function page and click the "Edit" button to open the dialog. 2) Edit the UDF function information. 3) Select UDF resources or upload a resource. 4) Click the "Cancel" button.
  - Expected: editing of the UDF function is cancelled.
- **049** UDF management / Function management / Delete UDF function (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the UDF function page, click the "Delete" button, then confirm.
  - Expected: the UDF function is deleted.
- **050** UDF management / Function management / Query (P1, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter the UDF function page, enter a UDF function name, and click the "Query" button.
  - Expected: the UDF function name supports fuzzy query: 1) if nothing matches, the UDF function list shows "no data"; 2) if there are matches, the list displays the data correctly.
- **051** UDF management / Function management / Verify the paging control with no more than one page of data (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Select 10 items/page with no more than 10 items and view the pagination. 2) Select 30 items/page with no more than 30 items. 3) Select 50 items/page with no more than 50 items.
  - Expected: in each case all data fits on a single page.
- **052** UDF management / Function management / Verify the paging control with more than one page of data (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: 1) Select 10 items/page with more than 10 items and view the pagination. 2) Select 30 items/page with more than 30 items. 3) Select 50 items/page with more than 50 items.
  - Expected: in each case the data is split across multiple pages.
- **053** UDF management / Function management / Verify the ">" button of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the ">" button and view the pagination.
  - Expected: the list jumps to the next page.
- **054** UDF management / Function management / Verify the "<" button of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: click the "<" button and view the pagination.
  - Expected: the list jumps to the previous page.
- **055** UDF management / Function management / Verify the page-number input box of the paging control (P2, finish)
  - Precondition: start the API and Hadoop file management services.
  - Steps: enter a page number and view the pagination.
  - Expected: the list jumps to the entered page.
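The 55 cases above are naturally data-driven: each is identified by a number, module, test point, and priority. One way to organize them in an e2e suite is a parametrized table keyed by case number and priority, so a smoke run can execute only the P1 cases. The names below are hypothetical, not DolphinScheduler's actual e2e API:

```python
# Hypothetical data-driven skeleton for the test cases above. A real suite
# would attach a driver action to each entry; here only the metadata and the
# priority filter are shown.
from dataclasses import dataclass

@dataclass(frozen=True)
class Case:
    number: str
    module: str
    point: str
    priority: str

CASES = [
    Case("001", "file management", "create folder", "P1"),
    Case("003", "file management", "cancel folder creation", "P2"),
    Case("015", "file management", "cancel uploading files", "P3"),
    Case("045", "UDF function management", "create UDF function", "P1"),
]

def smoke_set(cases):
    # Run only the highest-priority cases in a smoke pass
    return [c.number for c in cases if c.priority == "P1"]

assert smoke_set(CASES) == ["001", "045"]
```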

Use case

No response

Related issues

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Code of Conduct

Metadata

Labels: e2e, e2e test, feature, new feature, help wanted
