QNTM-1476: Upload JSON files to S3 #8136
Conversation
var jsonFolder = Path.Combine(tempPath, "json");
var jsonNonGuidFolder = Path.Combine(tempPath, "jsonNonGuid");

var jsonNonGuidFiles = Directory.GetFiles(jsonNonGuidFolder, "*.*", SearchOption.AllDirectories);
You need to either verify these paths exist or put these calls inside a try/catch; Directory.GetFiles will throw if the directories don't exist... which they may not on a clean machine.
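For illustration, a minimal sketch of the suggested guard, assuming the folder variables from the diff above:

// Guard against a clean machine where the folder has not been created yet.
var jsonNonGuidFiles = Directory.Exists(jsonNonGuidFolder)
    ? Directory.GetFiles(jsonNonGuidFolder, "*.*", SearchOption.AllDirectories)
    : new string[0];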
{
    foreach (var file in files)
    {
        File.Delete(file);
This should probably be made more tolerant: check that the files actually exist before you try this, and check whether the deletion succeeds; you may not have permissions when this call is made.
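For illustration, a more tolerant per-file delete might look roughly like this (the logging call is just a placeholder):

foreach (var file in files)
{
    try
    {
        if (File.Exists(file))
        {
            File.Delete(file);
        }
    }
    catch (Exception ex) // e.g. IOException on a locked file or UnauthorizedAccessException
    {
        Console.WriteLine("Could not delete {0}: {1}", file, ex.Message);
    }
}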
modelsGuidToIdMap.Clear();
DoWorkspaceOpenAndCompare(filePath, "json_nonGuidIds", ConvertCurrentWorkspaceToNonGuidJsonAndSave, CompareWorkspacesDifferentGuids, SaveWorkspaceComparisonDataWithNonGuidIds);

//UploadToS3() from Ram;
You will want to upload both the guid and non-guid tests, and all data files associated with those graphs, so this should be called from the other serializationTest method as well.
Hmm, you may want to add a breakpoint here and check if this method is called repeatedly... I don't think you want to make 2000 separate upload calls to S3; I think you just want to upload the two folders once.
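One possible way to avoid per-test uploads is a one-time fixture teardown, sketched below; the attribute name depends on the NUnit version in use, and UploadToS3 is the hypothetical helper referenced in the diff:

[OneTimeTearDown] // [TestFixtureTearDown] on older NUnit versions
public void UploadSerializationResults()
{
    // Upload the two result folders once, after every test in the fixture has run.
    UploadToS3(jsonFolder);
    UploadToS3(jsonNonGuidFolder);
}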
If Dynamo is going to upload to S3, then the configuration should be moved to config files or environment variables. Uploading to S3 is "sync"ing a folder on the machine with the S3 storage location, so one sync will get all the files into S3.
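A rough sketch of reading that configuration from an environment variable and pushing a whole folder with the AWS SDK for .NET; the bucket variable name and region here are made up, and credentials are assumed to come from the default provider chain:

using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

var bucket = Environment.GetEnvironmentVariable("DYNAMO_S3_BUCKET");
using (var client = new AmazonS3Client(RegionEndpoint.USEast1))
using (var transfer = new TransferUtility(client))
{
    // Uploads every file under jsonFolder in one call, roughly a one-way folder sync.
    transfer.UploadDirectory(jsonFolder, bucket);
}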
@ramramps @smangarole that's a good point... I don't think running these tests should upload to S3 - then what happens when a dev runs the test locally? It should not upload to S3. This should be some post-test step, or at least gated with a flag or something.
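For example, the upload could be gated behind an opt-in flag so local runs skip it entirely (the variable name is hypothetical, as is the UploadToS3 helper):

// Only upload when the CI job explicitly opts in; a dev running the tests locally skips this step.
if (Environment.GetEnvironmentVariable("DYNAMO_UPLOAD_SERIALIZATION_RESULTS") == "true")
{
    UploadToS3(jsonFolder);
    UploadToS3(jsonNonGuidFolder);
}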
Please make the uploading part separate from the Dynamo code. For an internal tool we can probably argue about it, but this will be wrapped in the DynamoTest NuGet package and distributed publicly.
Another thought here: the upload should happen from the script (job) after a successful build.
try
{
    jsonFilesList = Directory.GetFiles(jsonNonGuidFolder, "*.*", SearchOption.AllDirectories).ToList();
if jsonFolder exists but jsonNonGuidFolder does not, this will prevent deletion of the jsonFolder files. These should probably be in separate try/catch blocks.
Can you use Directory.Delete(jsonNonGuidFolder, true) here instead of deleting the files individually?
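For illustration, separate try/catch blocks around a recursive Directory.Delete could look roughly like this:

try
{
    if (Directory.Exists(jsonFolder))
        Directory.Delete(jsonFolder, true); // recursive delete of the guid-id results
}
catch (Exception ex) // e.g. IOException or UnauthorizedAccessException
{
    Console.WriteLine("Could not delete {0}: {1}", jsonFolder, ex.Message);
}

try
{
    if (Directory.Exists(jsonNonGuidFolder))
        Directory.Delete(jsonNonGuidFolder, true); // recursive delete of the non-guid-id results
}
catch (Exception ex)
{
    Console.WriteLine("Could not delete {0}: {1}", jsonNonGuidFolder, ex.Message);
}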
agree with @gregmarr 👍
The current "delete files individually" method has the problem that as soon as you try to delete a file for which you don't have delete permission (or that is currently locked), it aborts the entire operation.
Added an individual try/catch for each folder.
foreach (var file in files)
{
    // Delete files if they exist
    if (File.Exists(file))
This check isn't necessary (and the entire function may not be necessary if Directory.Delete works). If the file didn't exist, it wouldn't be in the list. If it did exist when the list was generated, but doesn't now, then it's still safe to call File.Delete on it, as deleting a non-existent file doesn't throw an exception.
I agree. If I am using Directory.Delete, I can remove this method entirely.
{
    modelsGuidToIdMap.Clear();
    DoWorkspaceOpenAndCompare(filePath, "json_nonGuidIds", ConvertCurrentWorkspaceToNonGuidJsonAndSave, CompareWorkspacesDifferentGuids, SaveWorkspaceComparisonDataWithNonGuidIds);
    DoWorkspaceOpenAndCompare(filePath, "json_nonGuidIds", ConvertCurrentWorkspaceToNonGuidJsonAndSave, CompareWorkspacesDifferentGuids, SaveWorkspaceComparisonDataWithNonGuidIds);
looks like a trailing whitespace only change
var jsonNonGuidFolder = Path.Combine(tempPath, "jsonNonGuid");

var jsonFilesList = new List<string>();
var jsonNonGuidFilesList = new List<string>();
These two variables are now obsolete.
Oh, yes. Deleted!
Please Note:
The DynamoRevit repo will need to be cherry-picked into all the DynamoRevit Release branches that Dynamo supports. Contributors will be responsible for cherry-picking their reviewed commits to the other branches after a LGTM label is added to the PR.
Purpose
The purpose of this PR is to upload the newly generated JSON files to S3. In this PR, I updated TestFixtureSetup() to delete the files in the temp folder, which will enable us to upload a new set of files to S3 after every test run. This PR is part of the testing tool that we are building.
Declarations
Check these if you believe they are true
Changes to the API follow Semantic Versioning (https://github.com/DynamoDS/Dynamo/wiki/Dynamo-Versions), and are documented in the API Changes document.
Reviewers
@mjkkirschner @ramramps @QilongTang