This repository was archived by the owner on Sep 20, 2024. It is now read-only.

Deadline: set PATH environment in deadline jobs by GlobalJobPreLoad #5622

Merged
moonyuet merged 5 commits into develop from
enhancement/OP-6381_Update-deadline-GlobalJobPreLoad-to-update-job-env
Sep 21, 2023

Conversation

@MustafaJafar
Member

@MustafaJafar MustafaJafar commented Sep 14, 2023

Changelog Description

This PR makes GlobalJobPreLoad set the PATH environment variable in Deadline jobs, so that we don't have to use the full executable path for Deadline to launch the DCC app.
This trick should save us from adding logic to pass the Houdini patch version and from modifying the Houdini Deadline plugin.
It should work with other DCCs as well.

Example :

To make Deadline call the desired Houdini executable, we should:

  1. update the PATH environment variable in the app environment and add a HOUDINI_VERSION environment variable
    (screenshot)

  2. in the Deadline plugin settings, use only the executable name
    (screenshot)

Additional Info

The idea of this PR was discussed in #5384

Testing notes:

  1. Update your DCC settings in OpenPype, as in the screenshots above.
    Here are snippets for Maya and Houdini, but feel free to do the same with other DCCs.
    For Houdini, you can use:

Don't forget to add HOUDINI_VERSION for each Houdini version.

{
    "PATH": [
        "C:/Program Files/Side Effects Software/Houdini {HOUDINI_VERSION}/bin",
        "{PATH}"
    ]
}

For Maya, you can use:

Don't forget to add MAYA_VERSION for each Maya version.

{
    "PATH": [
        "C:/Program Files/Autodesk/Maya{MAYA_VERSION}/bin",
        "{PATH}"
    ]
}
  2. In the Deadline plugin settings, use only executable names.
    For Houdini, you can use:
hython.exe
hython

For Maya, you can use:

maya.exe
maya
  3. Publish jobs from different DCC versions.
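As a rough illustration of how PATH entries like the JSON snippets above could be combined, here is a minimal Python sketch. The `resolve_path_entries` helper and the sample values are hypothetical; the real OpenPype settings resolution differs in detail.

```python
import os

def resolve_path_entries(entries, env):
    """Expand {PLACEHOLDER} tokens in each PATH entry against an
    environment dict, then join them with the platform separator.

    Simplified sketch only; not the actual OpenPype implementation.
    """
    resolved = [entry.format(**env) for entry in entries]
    return os.pathsep.join(resolved)

env = {
    "HOUDINI_VERSION": "19.5",    # hypothetical per-version value
    "PATH": "C:/existing/stuff",  # whatever PATH already contains
}
entries = [
    "C:/Program Files/Side Effects Software/Houdini {HOUDINI_VERSION}/bin",
    "{PATH}",
]
print(resolve_path_entries(entries, env))
```

Note that `str.format` simply ignores unused keys in `env`, so one environment dict can serve templates for several DCCs.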

@MustafaJafar MustafaJafar added the type: enhancement and module: Deadline labels Sep 14, 2023
@ynbot
Contributor

ynbot commented Sep 14, 2023

@ynbot ynbot added the size/XS label Sep 14, 2023
Collaborator

@BigRoy BigRoy left a comment


Aside from code cosmetics, the feature makes sense.

I wonder whether we should add a comment in the code, in the if branch, explaining WHY we're doing it. E.g.

# Set os.environ[PATH] so studio settings' path entries
# can be used to define search path for executables.
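For illustration, the suggested comment could sit on a branch roughly like this minimal sketch. The `apply_job_path` name and the `contents` dict are hypothetical stand-ins, not the actual GlobalJobPreLoad code.

```python
import os

def apply_job_path(contents):
    """Hypothetical sketch of the if branch discussed above;
    `contents` stands in for the environment dict built for the job."""
    if "PATH" in contents:
        # Set os.environ["PATH"] so studio settings' path entries
        # can be used to define the search path for executables.
        print("PATH: {}".format(contents["PATH"]))
        os.environ["PATH"] = contents["PATH"]

apply_job_path(
    {"PATH": "C:/Program Files/Side Effects Software/Houdini 19.5/bin"}
)
```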

@BigRoy
Collaborator

BigRoy commented Sep 14, 2023

Actually, one other note: can we add documentation about this use case? Somewhere in the Admin Docs for Deadline we should explain how to configure things so that application versions can be picked entirely by settings using this method.

@moonyuet
Member

moonyuet commented Sep 15, 2023

I followed your testing notes to submit the render. The render fails and shows an error in Deadline. Do I need to add the Houdini 19_5_Hython_Executable as a parameter first, before doing what you mentioned in the testing notes? I guess I need to add the Houdini 19_5_Hython_Executable as parameters for sure.

@BigRoy
Collaborator

BigRoy commented Sep 15, 2023

I guess I need to add the Houdini 19_5_Hython_Executable as parameters for sure.

Yes, if your Deadline doesn't have the parameters required for a Houdini version to begin with, it'll fail. I'm not sure how often Deadline still releases updates now that it's gone "free", but I reckon you're best off manually adding the parameters if they're not available in an update yet.

@moonyuet
Member

moonyuet commented Sep 15, 2023

I have added the 19.5 parameter to the Houdini plugin. It still fails to render, but it renders successfully with the full path.
I set everything up according to the testing notes.

(screenshots of the settings)

The error log is shown below.

=======================================================
Error
=======================================================
Error: FailRenderException : Houdini 19_5 hython executable was not found in the semicolon separated list "hython.exe;hython;". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor.
   at Deadline.Plugins.DeadlinePlugin.FailRender(String message) (Python.Runtime.PythonException)
  File "C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\65047673c714d533149185d8\Houdini.py", line 99, in RenderExecutable
    self.FailRender( "Houdini " + version + " hython executable was not found in the semicolon separated list \"" + houdiniExeList + "\". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor." )
   at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
   at __FranticX_GenericDelegate0`1\[\[System_String\, System_Private_CoreLib\, Version=4_0_0_0\, Culture=neutral\, PublicKeyToken=7cec85d7bea7798e\]\]Dispatcher.Invoke()
   at FranticX.Processes.ManagedProcess.RenderExecutable()
   at Deadline.Plugins.DeadlinePlugin.RenderExecutable()
   at FranticX.Processes.ManagedProcess.Execute(Boolean waitForExit)
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)

=======================================================
Type
=======================================================
RenderPluginException

=======================================================
Stack Trace
=======================================================
   at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bgj, CancellationToken bgk)
   at Deadline.Plugins.SandboxedPlugin.RenderTask(Task task, CancellationToken cancellationToken)
   at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter ajq, CancellationToken ajr)

=======================================================
Log
=======================================================
2023-09-15 23:21:46:  0: Render Thread - Render State transition from = 'ReceivedTask' to = 'Other'
2023-09-15 23:21:46:  0: Loading Job's Plugin timeout is Disabled
2023-09-15 23:21:46:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'Kayla'
2023-09-15 23:21:48:  0: Loaded plugin Houdini
2023-09-15 23:21:48:  0: Executing plugin command of type 'Initialize Plugin'
2023-09-15 23:21:48:  0: INFO: Executing plugin script 'C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\65047673c714d533149185d8\Houdini.py'
2023-09-15 23:21:48:  0: INFO: Plugin execution sandbox using Python version 3
2023-09-15 23:21:48:  0: INFO: About: Houdini Plugin for Deadline
2023-09-15 23:21:48:  0: INFO: The job's environment will be merged with the current environment before rendering
2023-09-15 23:21:48:  0: Done executing plugin command of type 'Initialize Plugin'
2023-09-15 23:21:48:  0: Start Job timeout is disabled.
2023-09-15 23:21:48:  0: Task timeout is disabled.
2023-09-15 23:21:48:  0: Loaded job: new_project_ep_asset_v220.hiplc - redshift_ropMain (65047673c714d533149185d8)
2023-09-15 23:21:48:  0: Executing plugin command of type 'Start Job'
2023-09-15 23:21:48:  0: DEBUG: S3BackedCache Client is not installed.
2023-09-15 23:21:48:  0: INFO: Executing global asset transfer preload script 'C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\65047673c714d533149185d8\GlobalAssetTransferPreLoad.py'
2023-09-15 23:21:48:  0: INFO: Looking for legacy (pre-10.0.26) AWS Portal File Transfer...
2023-09-15 23:21:48:  0: INFO: Looking for legacy (pre-10.0.26) File Transfer controller in C:/Program Files/Thinkbox/S3BackedCache/bin/task.py...
2023-09-15 23:21:48:  0: INFO: Could not find legacy (pre-10.0.26) AWS Portal File Transfer.
2023-09-15 23:21:48:  0: INFO: Legacy (pre-10.0.26) AWS Portal File Transfer is not installed on the system.
2023-09-15 23:21:48:  0: Done executing plugin command of type 'Start Job'
2023-09-15 23:21:48:  0: Plugin rendering frame(s): 21-24
2023-09-15 23:21:48:  0: Render Thread - Render State transition from = 'Other' to = 'Rendering'
2023-09-15 23:21:49:  0: Executing plugin command of type 'Render Task'
2023-09-15 23:21:49:  0: INFO: Starting Houdini Job
2023-09-15 23:21:49:  0: INFO: Stdout Redirection Enabled: True
2023-09-15 23:21:49:  0: INFO: Stdout Handling Enabled: True
2023-09-15 23:21:49:  0: INFO: Popup Handling Enabled: True
2023-09-15 23:21:49:  0: INFO: QT Popup Handling Enabled: False
2023-09-15 23:21:49:  0: INFO: WindowsForms10.Window.8.app.* Popup Handling Enabled: False
2023-09-15 23:21:49:  0: INFO: Using Process Tree: True
2023-09-15 23:21:49:  0: INFO: Hiding DOS Window: True
2023-09-15 23:21:49:  0: INFO: Creating New Console: False
2023-09-15 23:21:49:  0: INFO: Running as user: Kayla
2023-09-15 23:21:49:  0: Done executing plugin command of type 'Render Task'

=======================================================
Details
=======================================================
Date: 09/15/2023 23:21:52
Frames: 21-24
Elapsed Time: 00:00:00:06
Job Submit Date: 09/15/2023 23:21:23
Job User: kayla
Average RAM Usage: 14641614848 (43%)
Peak RAM Usage: 14641614848 (43%)
Average CPU Usage: 1%
Peak CPU Usage: 2%
Used CPU Clocks (x10^6 cycles): 1102
Total CPU Clocks (x10^6 cycles): 110140

=======================================================
Worker Information
=======================================================
Worker Name: DESKTOP-1U8VP0C
Version: v10.1.22.5 Release (919d43564)
Operating System: Windows 11 Pro
Running As Service: No
Machine User: Kayla
IP Address: 192.168.50.249
MAC Address: 48:E7:DA:F4:8E:23
CPU Architecture: x64
CPUs: 16
CPU Usage: 2%
Memory Usage: 13.6 GB / 31.8 GB (42%)
Free Disk Space: 1.892 TB (446.791 GB on C:\, 1.455 TB on D:\)
Video Card: NVIDIA GeForce RTX 3070

@MustafaJafar
Member Author

@moonyuet
It's a little bit strange that there's nothing in the log about injected OpenPype environments!

There should be a line that prints contents['PATH'].

@moonyuet
Member

@moonyuet It's a little bit strange that there's nothing in the log about injected OpenPype environments!

There should be a line that prints contents['PATH'].

I found out that I had disabled ProcessSubmitPublishJobOnFarm for testing; I will test the PR with it enabled.

@moonyuet
Member

It still errors out even when I turn on Submit Publish Job On Farm.
(screenshot)
The error log is below

=======================================================
Error
=======================================================
Error: FailRenderException : Houdini 19_5 hython executable was not found in the semicolon separated list "hython.exe;hython;". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor.
   at Deadline.Plugins.DeadlinePlugin.FailRender(String message) (Python.Runtime.PythonException)
  File "C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\650808c33d53d716e88b6e0c\Houdini.py", line 99, in RenderExecutable
    self.FailRender( "Houdini " + version + " hython executable was not found in the semicolon separated list \"" + houdiniExeList + "\". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor." )
   at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
   at __FranticX_GenericDelegate0`1\[\[System_String\, System_Private_CoreLib\, Version=4_0_0_0\, Culture=neutral\, PublicKeyToken=7cec85d7bea7798e\]\]Dispatcher.Invoke()
   at FranticX.Processes.ManagedProcess.RenderExecutable()
   at Deadline.Plugins.DeadlinePlugin.RenderExecutable()
   at FranticX.Processes.ManagedProcess.Execute(Boolean waitForExit)
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)

=======================================================
Type
=======================================================
RenderPluginException

=======================================================
Stack Trace
=======================================================
   at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bgj, CancellationToken bgk)
   at Deadline.Plugins.SandboxedPlugin.RenderTask(Task task, CancellationToken cancellationToken)
   at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter ajq, CancellationToken ajr)

=======================================================
Log
=======================================================
2023-09-18 16:23:25:  0: Render Thread - Render State transition from = 'ReceivedTask' to = 'Other'
2023-09-18 16:23:25:  0: Loading Job's Plugin timeout is Disabled
2023-09-18 16:23:25:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'Kayla'
2023-09-18 16:23:27:  0: Loaded plugin Houdini
2023-09-18 16:23:27:  0: Executing plugin command of type 'Initialize Plugin'
2023-09-18 16:23:27:  0: INFO: Executing plugin script 'C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\650808c33d53d716e88b6e0c\Houdini.py'
2023-09-18 16:23:27:  0: INFO: Plugin execution sandbox using Python version 3
2023-09-18 16:23:27:  0: INFO: About: Houdini Plugin for Deadline
2023-09-18 16:23:27:  0: INFO: The job's environment will be merged with the current environment before rendering
2023-09-18 16:23:27:  0: Done executing plugin command of type 'Initialize Plugin'
2023-09-18 16:23:27:  0: Start Job timeout is disabled.
2023-09-18 16:23:27:  0: Task timeout is disabled.
2023-09-18 16:23:27:  0: Loaded job: new_project_ep_asset_v222.hiplc - redshift_ropMain (650808c33d53d716e88b6e0c)
2023-09-18 16:23:27:  0: Executing plugin command of type 'Start Job'
2023-09-18 16:23:27:  0: DEBUG: S3BackedCache Client is not installed.
2023-09-18 16:23:27:  0: INFO: Executing global asset transfer preload script 'C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\650808c33d53d716e88b6e0c\GlobalAssetTransferPreLoad.py'
2023-09-18 16:23:27:  0: INFO: Looking for legacy (pre-10.0.26) AWS Portal File Transfer...
2023-09-18 16:23:27:  0: INFO: Looking for legacy (pre-10.0.26) File Transfer controller in C:/Program Files/Thinkbox/S3BackedCache/bin/task.py...
2023-09-18 16:23:27:  0: INFO: Could not find legacy (pre-10.0.26) AWS Portal File Transfer.
2023-09-18 16:23:27:  0: INFO: Legacy (pre-10.0.26) AWS Portal File Transfer is not installed on the system.
2023-09-18 16:23:27:  0: Done executing plugin command of type 'Start Job'
2023-09-18 16:23:27:  0: Plugin rendering frame(s): 1-10
2023-09-18 16:23:27:  0: Render Thread - Render State transition from = 'Other' to = 'Rendering'
2023-09-18 16:23:28:  0: Executing plugin command of type 'Render Task'
2023-09-18 16:23:28:  0: INFO: Starting Houdini Job
2023-09-18 16:23:28:  0: INFO: Stdout Redirection Enabled: True
2023-09-18 16:23:28:  0: INFO: Stdout Handling Enabled: True
2023-09-18 16:23:28:  0: INFO: Popup Handling Enabled: True
2023-09-18 16:23:28:  0: INFO: QT Popup Handling Enabled: False
2023-09-18 16:23:28:  0: INFO: WindowsForms10.Window.8.app.* Popup Handling Enabled: False
2023-09-18 16:23:28:  0: INFO: Using Process Tree: True
2023-09-18 16:23:28:  0: INFO: Hiding DOS Window: True
2023-09-18 16:23:28:  0: INFO: Creating New Console: False
2023-09-18 16:23:28:  0: INFO: Running as user: Kayla
2023-09-18 16:23:28:  0: Done executing plugin command of type 'Render Task'

=======================================================
Details
=======================================================
Date: 09/18/2023 16:23:31
Frames: 1-10
Elapsed Time: 00:00:00:06
Job Submit Date: 09/18/2023 16:22:27
Job User: kayla
Average RAM Usage: 15882478592 (47%)
Peak RAM Usage: 15882477568 (47%)
Average CPU Usage: 4%
Peak CPU Usage: 10%
Used CPU Clocks (x10^6 cycles): 4378
Total CPU Clocks (x10^6 cycles): 109431

=======================================================
Worker Information
=======================================================
Worker Name: DESKTOP-1U8VP0C
Version: v10.1.22.5 Release (919d43564)
Operating System: Windows 11 Pro
Running As Service: No
Machine User: Kayla
IP Address: 192.168.50.249
MAC Address: 48:E7:DA:F4:8E:23
CPU Architecture: x64
CPUs: 16
CPU Usage: 3%
Memory Usage: 14.8 GB / 31.8 GB (46%)
Free Disk Space: 1.907 TB (462.182 GB on C:\, 1.455 TB on D:\)
Video Card: NVIDIA GeForce RTX 3070

@BigRoy
Collaborator

BigRoy commented Sep 18, 2023

Could you share the log for the FIRST task your machine runs for that job? The GlobalJobPreLoad only runs before the job starts, not before every task - so the output of the GlobalJobPreLoad itself is only visible in the logs of a machine that is starting the job, not whenever it moves on to the next frame of a job, as far as I know. That's also why, I believe, your shared logs lack any logs/prints from the GlobalJobPreLoad script.

In short, make sure to check the logs for the FIRST TASK a machine starts the job with.

Also, make sure to resubmit the job whenever you update the GlobalJobPreLoad file, which this PR does. By default, I believe Deadline caches the plugin files on the local machine on the first run of a job, so resubmitting is the safest route. Otherwise, if you just requeue, it may still use the old plugin files.

@moonyuet
Member

Could you share the log for the FIRST task your machine runs for that job? The GlobalJobPreLoad only runs before the job starts, not before every task - so the output of the GlobalJobPreLoad itself is only visible in the logs of a machine that is starting the job, not whenever it moves on to the next frame of a job, as far as I know. That's also why, I believe, your shared logs lack any logs/prints from the GlobalJobPreLoad script.

In short, make sure to check the logs for the FIRST TASK a machine starts the job with.

Also, make sure to resubmit the job whenever you update the GlobalJobPreLoad file, which this PR does. By default, I believe Deadline caches the plugin files on the local machine on the first run of a job, so resubmitting is the safest route. Otherwise, if you just requeue, it may still use the old plugin files.

This is the error log for the first task (task ID 0).
(screenshot of the error log)

I have tried resubmission too...
(screenshot)

@BigRoy
Collaborator

BigRoy commented Sep 20, 2023

Could you please post the full log? :) You seem to be cutting off the part of the log that would actually show the output of the GlobalJobPreLoad logs - which tends to be further down. I understand the error is the same, but I'm mostly interested in seeing/reading what GlobalJobPreLoad ends up doing. What PATH does it mention? What env vars does it set, etc.?

Also, a minor side note: the task ID being zero doesn't necessarily mean that your machine STARTED the job on that task. It could have been running the job on tasks 0, 1, 2, and then you requeued 0 while it was running; it would then process task 0 again, but it would still be a continuous job run from tasks 1 and 2 if it was still rendering the job. ;)

@moonyuet
Member

moonyuet commented Sep 20, 2023

Could you please post the full log? :) You seem to be cutting off the part of the log that would actually show the output of the GlobalJobPreLoad logs - which tends to be further down. I understand the error is the same, but I'm mostly interested in seeing/reading what GlobalJobPreLoad ends up doing. What PATH does it mention? What env vars does it set, etc.?

Also, a minor side note: the task ID being zero doesn't necessarily mean that your machine STARTED the job on that task. It could have been running the job on tasks 0, 1, 2, and then you requeued 0 while it was running; it would then process task 0 again, but it would still be a continuous job run from tasks 1 and 2 if it was still rendering the job. ;)

(screenshot)
They all fail with the same error; the log is shown below.

=======================================================
Error
=======================================================
Error: FailRenderException : Houdini 19_5 hython executable was not found in the semicolon separated list "hython.exe;hython;". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor.
   at Deadline.Plugins.DeadlinePlugin.FailRender(String message) (Python.Runtime.PythonException)
  File "C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\650808c33d53d716e88b6e0c\Houdini.py", line 99, in RenderExecutable
    self.FailRender( "Houdini " + version + " hython executable was not found in the semicolon separated list \"" + houdiniExeList + "\". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor." )
   at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
   at __FranticX_GenericDelegate0`1\[\[System_String\, System_Private_CoreLib\, Version=4_0_0_0\, Culture=neutral\, PublicKeyToken=7cec85d7bea7798e\]\]Dispatcher.Invoke()
   at FranticX.Processes.ManagedProcess.RenderExecutable()
   at Deadline.Plugins.DeadlinePlugin.RenderExecutable()
   at FranticX.Processes.ManagedProcess.Execute(Boolean waitForExit)
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)

=======================================================
Type
=======================================================
RenderPluginException

=======================================================
Stack Trace
=======================================================
   at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bgj, CancellationToken bgk)
   at Deadline.Plugins.SandboxedPlugin.RenderTask(Task task, CancellationToken cancellationToken)
   at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter ajq, CancellationToken ajr)

=======================================================
Log
=======================================================
2023-09-18 16:22:58:  0: Render Thread - Render State transition from = 'ReceivedTask' to = 'Other'
2023-09-18 16:22:58:  0: Loading Job's Plugin timeout is Disabled
2023-09-18 16:22:58:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'Kayla'
2023-09-18 16:23:00:  0: Loaded plugin Houdini
2023-09-18 16:23:00:  0: Executing plugin command of type 'Initialize Plugin'
2023-09-18 16:23:00:  0: INFO: Executing plugin script 'C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\650808c33d53d716e88b6e0c\Houdini.py'
2023-09-18 16:23:00:  0: INFO: Plugin execution sandbox using Python version 3
2023-09-18 16:23:00:  0: INFO: About: Houdini Plugin for Deadline
2023-09-18 16:23:00:  0: INFO: The job's environment will be merged with the current environment before rendering
2023-09-18 16:23:00:  0: Done executing plugin command of type 'Initialize Plugin'
2023-09-18 16:23:00:  0: Start Job timeout is disabled.
2023-09-18 16:23:00:  0: Task timeout is disabled.
2023-09-18 16:23:00:  0: Loaded job: new_project_ep_asset_v222.hiplc - redshift_ropMain (650808c33d53d716e88b6e0c)
2023-09-18 16:23:00:  0: Executing plugin command of type 'Start Job'
2023-09-18 16:23:00:  0: DEBUG: S3BackedCache Client is not installed.
2023-09-18 16:23:00:  0: INFO: Executing global asset transfer preload script 'C:\ProgramData\Thinkbox\Deadline10\workers\desktop-1u8vp0c\plugins\650808c33d53d716e88b6e0c\GlobalAssetTransferPreLoad.py'
2023-09-18 16:23:00:  0: INFO: Looking for legacy (pre-10.0.26) AWS Portal File Transfer...
2023-09-18 16:23:00:  0: INFO: Looking for legacy (pre-10.0.26) File Transfer controller in C:/Program Files/Thinkbox/S3BackedCache/bin/task.py...
2023-09-18 16:23:00:  0: INFO: Could not find legacy (pre-10.0.26) AWS Portal File Transfer.
2023-09-18 16:23:00:  0: INFO: Legacy (pre-10.0.26) AWS Portal File Transfer is not installed on the system.
2023-09-18 16:23:00:  0: Done executing plugin command of type 'Start Job'
2023-09-18 16:23:00:  0: Plugin rendering frame(s): 1-10
2023-09-18 16:23:00:  0: Render Thread - Render State transition from = 'Other' to = 'Rendering'
2023-09-18 16:23:00:  0: Executing plugin command of type 'Render Task'
2023-09-18 16:23:00:  0: INFO: Starting Houdini Job
2023-09-18 16:23:00:  0: INFO: Stdout Redirection Enabled: True
2023-09-18 16:23:00:  0: INFO: Stdout Handling Enabled: True
2023-09-18 16:23:00:  0: INFO: Popup Handling Enabled: True
2023-09-18 16:23:00:  0: INFO: QT Popup Handling Enabled: False
2023-09-18 16:23:00:  0: INFO: WindowsForms10.Window.8.app.* Popup Handling Enabled: False
2023-09-18 16:23:00:  0: INFO: Using Process Tree: True
2023-09-18 16:23:00:  0: INFO: Hiding DOS Window: True
2023-09-18 16:23:00:  0: INFO: Creating New Console: False
2023-09-18 16:23:00:  0: INFO: Running as user: Kayla
2023-09-18 16:23:00:  0: Done executing plugin command of type 'Render Task'

=======================================================
Details
=======================================================
Date: 09/18/2023 16:23:04
Frames: 1-10
Elapsed Time: 00:00:00:06
Job Submit Date: 09/18/2023 16:22:27
Job User: kayla
Average RAM Usage: 15988027392 (47%)
Peak RAM Usage: 15995641856 (47%)
Average CPU Usage: 1%
Peak CPU Usage: 3%
Used CPU Clocks (x10^6 cycles): 1053
Total CPU Clocks (x10^6 cycles): 105257

=======================================================
Worker Information
=======================================================
Worker Name: DESKTOP-1U8VP0C
Version: v10.1.22.5 Release (919d43564)
Operating System: Windows 11 Pro
Running As Service: No
Machine User: Kayla
IP Address: 192.168.50.249
MAC Address: 48:E7:DA:F4:8E:23
CPU Architecture: x64
CPUs: 16
CPU Usage: 2%
Memory Usage: 14.9 GB / 31.8 GB (46%)
Free Disk Space: 1.907 TB (462.182 GB on C:\, 1.455 TB on D:\)
Video Card: NVIDIA GeForce RTX 3070

It doesn't print PATH in the log.
I just read GlobalJobPreLoad.py a little and have a maybe-very-dumb question: is the {PATH} only applicable to the OpenPype publish job in this case? Can we try to set up this environment in the Houdini render submitter instead?

@BigRoy
Collaborator

BigRoy commented Sep 20, 2023

No, the PATH comes from OpenPype's settings for the application being launched (in this case Houdini), and since it triggers through GlobalJobPreLoad, it runs for ALL Deadline jobs - that is, if you have it configured in your studio settings as you showed here.

You seem to be lacking ANY output from the GlobalJobPreLoad.py script, which makes me think you may not have it installed for your Deadline.
Some documentation on that is here (in particular step 5) and here.

You should at the very least see the print output of these two lines, and thus should see this in your logs:

*** GlobalJobPreload start ...
>>> Getting job ...

That is, no matter what plugin you are running and no matter how your job is submitted through Deadline - if that GlobalJobPreLoad is installed, it should always go through that __main__ function and thus always print that. If not, then something about that Python file isn't working on your machine/Deadline, but my hunch is that you just don't have it installed.


It's also good to note that HOW you've configured it is actually wrong. You don't want to set an absolute Houdini path in PATH in the global env for the Houdini application; instead, set the correct Houdini path per version.

In your case, since you have the HOUDINI_VERSION env var set per application version, your global Houdini environment can be set as:

{
    "PATH": [
        "C:/Program Files/Side Effects Software/Houdini {HOUDINI_VERSION}/bin",
        "{PATH}"
    ]
}

Note: you do not replace {HOUDINI_VERSION} - you keep this JSON exactly like that in your system settings, so that it takes the HOUDINI_VERSION value you have configured per version. Otherwise, whatever job you submit would always load your Houdini 19.5.435, since that is always the one you're setting on PATH.
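A minimal sketch of why the placeholder must stay in the settings: the same template resolves to a different bin directory for each configured HOUDINI_VERSION (the version strings here are just examples).

```python
# The template is stored verbatim in settings; only at launch time is
# {HOUDINI_VERSION} filled in from the per-version configuration.
template = "C:/Program Files/Side Effects Software/Houdini {HOUDINI_VERSION}/bin"

for version in ("19.5.435", "20.0.506"):  # hypothetical configured versions
    print(template.format(HOUDINI_VERSION=version))
```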

@fabiaserra
Contributor

Repeating here what I said on Discord: not all infrastructures will have the same environment for Deadline workers as the local one, which is why we would want to keep full control of the PATH used on Deadline versus what's used locally. Our Deadline workers, for example, run as Dockerized containers, and the location where the DCC binaries are installed doesn't match the PATH of the workstation that submitted the job. I don't think you should merge this change, and someone could argue that the GlobalJobPreLoad is actually already injecting too many env vars into the worker and that we should control what gets passed (i.e., PYTHONPATH could cause problems as well if taken from the local workstation).

@BigRoy
Collaborator

BigRoy commented Sep 20, 2023

Repeating here what I said on Discord: not all infrastructures will have the same environment for Deadline workers as the local one, which is why we would want to keep full control of the PATH used on Deadline versus what's used locally. Our Deadline workers, for example, run as Dockerized containers, and the location where the DCC binaries are installed doesn't match the PATH of the workstation that submitted the job. I don't think you should merge this change, and someone could argue that the GlobalJobPreLoad is actually already injecting too many env vars into the worker and that we should control what gets passed (i.e., PYTHONPATH could cause problems as well if taken from the local workstation).

It could just be a toggle in the Deadline Plugin Settings, by the way - as trivial as that.
Also, note that PATH is still constructed from the local environment, since it goes through the same launching process as the actual applications. (Actually, the Houdini process that launches already has the EXACT same environment as it has now; the only thing that changes is that we're also setting it for the Deadline plugin's own context.) Additionally, note that PATH is only set if the studio settings define it for the running application (or the global environment).

(i.e., PYTHONPATH could cause problems as well if taken from the local workstation)

Again, we're not passing the source machine's environment. We're injecting the application environment that OpenPype's application launching builds. That environment is computed and built locally on the machine, not passed along from where the job was submitted at all.

Is that maybe a misinterpretation you've made? I feel like I understand your worry if we were just randomly injecting user environments into the farm, but in practice this PR does nothing more than also set PATH (which is already set for any process launched from a Deadline plugin currently!) in the current Deadline sandbox environment, so that an executable can be found by filename, just like a regular "which houdini" would return it. If the Deadline settings actually specify full paths to executables, then as far as I can see this PR changes nothing in behavior for the processes launched from Deadline compared to what GlobalJobPreLoad already does.
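A small POSIX-flavored sketch of the lookup this enables: once a bin directory is on PATH, shutil.which resolves a bare executable name to a full path, similar in spirit to how a bare name from Deadline's semicolon-separated executable list can be resolved. The temporary directory and the hython stub are purely illustrative.

```python
import os
import shutil
import stat
import tempfile

# Create a throwaway "bin" directory with a fake hython executable.
bin_dir = tempfile.mkdtemp()
exe = os.path.join(bin_dir, "hython")  # stand-in executable name
with open(exe, "w") as f:
    f.write("#!/bin/sh\n")
os.chmod(exe, os.stat(exe).st_mode | stat.S_IEXEC)

# Prepend the directory to PATH, as the GlobalJobPreLoad approach would,
# then look up the executable by bare name.
os.environ["PATH"] = bin_dir + os.pathsep + os.environ.get("PATH", "")
print(shutil.which("hython"))  # resolves to the full path of the stub
```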

Repeating here what I said on Discord,

For completeness' sake, here's that Discord post for future reference.

@fabiaserra
Contributor

Repeating here what I said on Discord: not all infrastructures will have the same environment for Deadline workers as the local one, which is why we would want to keep full control of the PATH used on Deadline versus what's used locally. Our Deadline workers, for example, run as Dockerized containers, and the location where the DCC binaries are installed doesn't match the PATH of the workstation that submitted the job. I don't think you should merge this change, and someone could argue that the GlobalJobPreLoad is actually already injecting too many env vars into the worker and that we should control what gets passed (i.e., PYTHONPATH could cause problems as well if taken from the local workstation).

It could just be a toggle in the Deadline Plugin Settings, by the way; trivial as that. Also, note that it's still constructing PATH using the local environment, since it goes through the same launching process as the actual applications. (Houdini itself, when launched, already has the EXACT same environment it has now; the only thing that changes is that we're also setting it for the Deadline Plugin's context itself.) Additionally, note that it only sets PATH if the studio settings define it for the running application (or in the global environment).

> (i.e., PYTHONPATH could cause problems as well if taken from the local workstation)

Again, we're not passing along the source machine's environment. We're injecting the Application environment that OpenPype's Application launching builds. That environment is computed and built locally on the worker machine, not passed along from where the job was submitted.

Is that maybe a misinterpretation you've made? I understand the worry if we were just randomly injecting user environments onto the farm, but in practice this PR does nothing more than also set PATH (which is already set for any process launched from a Deadline Plugin!) in the current Deadline sandbox environment, so that an executable can be found by filename, the way a regular "which houdini" lookup would. If the Deadline Settings actually specify full paths to executables, then as far as I can see this PR changes nothing in behavior for the processes launched from Deadline compared to what GlobalJobPreLoad already does.

> Repeating here what I said on Discord,

For completeness' sake, here's that Discord post for future reference.
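To make the lookup described above concrete: resolving a bare executable name against an injected PATH is essentially what Python's `shutil.which` does. A minimal sketch, with a made-up PATH entry purely for illustration:

```python
import os
import shutil

def resolve_executable(name, path_env):
    # Resolve a bare executable name (e.g. "hython") against an explicit
    # PATH string, the same lookup the OS performs once PATH is set for
    # the Deadline sandbox.
    return shutil.which(name, path=path_env)

# Hypothetical injected PATH; the Houdini directory is an example only.
injected_path = os.pathsep.join([
    "C:/Program Files/Side Effects Software/Houdini 19.5.435/bin",
    os.environ.get("PATH", ""),
])

# Returns the full path to the executable if found on PATH, else None.
print(resolve_executable("hython", injected_path))
```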

Ohh you are right, I had a lapse in thinking about where these env vars were being passed; I thought they were being set as job properties in Deadline and could affect other processes. Yeah, that makes sense, sorry for the confusion. Should be okay to merge.

@moonyuet
Member

moonyuet commented Sep 21, 2023

No, the PATH would come from OpenPype's settings for the application being launched (in this case Houdini), and since it triggers through GlobalJobPreLoad it runs for ALL Deadline jobs. That is, if you have it configured in your studio settings as you showed here.

You seem to be lacking ANY output from the GlobalJobPreLoad.py script, which makes me think you maybe don't have it installed for your Deadline? Some documentation on that is here (in particular step 5) and here.

You should at the very least see the print output of these two lines in your logs:

*** GlobalJobPreload start ...
>>> Getting job ...

That is, no matter what plugin you are running and no matter how your job is submitted through Deadline: if that GlobalJobPreLoad is installed, it should always go through that __main__ function and thus always print that. If not, then something about that Python file isn't working on your machine or Deadline, but my hunch is that you just don't have it installed.

It's also good to note that HOW you've configured it here is actually wrong. You don't want to set that absolute Houdini path in PATH in the global env for the Houdini application; instead, set the correct Houdini path per version.

In your case, since you have the HOUDINI_VERSION env var set per application version, your global Houdini environment can be set as:

{
    "PATH": [
        "C:/Program Files/Side Effects Software/Houdini {HOUDINI_VERSION}/bin",
        "{PATH}"
    ]
}

Note: you do not replace {HOUDINI_VERSION}; keep this JSON exactly like that in your system settings so that it picks up the HOUDINI_VERSION value you have configured per version. Otherwise, whatever job you submitted would always load your 19.5.435 Houdini version, since that would always be the one set on PATH.
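The per-version substitution described above can be sketched as follows. This is a simplified stand-in for OpenPype's actual environment resolution, assuming {NAME} placeholders and path-list joining:

```python
import os

def resolve_env_templates(env_settings, context):
    # Expand {NAME} placeholders using the per-app-version context
    # (e.g. HOUDINI_VERSION), joining list values with the platform's
    # path separator, as PATH expects.
    resolved = {}
    for key, value in env_settings.items():
        if isinstance(value, list):
            value = os.pathsep.join(value)
        resolved[key] = value.format(**context)
    return resolved

# The settings keep the placeholder; each app version supplies the value.
settings = {
    "PATH": [
        "C:/Program Files/Side Effects Software/Houdini {HOUDINI_VERSION}/bin",
        "{PATH}",
    ],
}
env = resolve_env_templates(
    settings, {"HOUDINI_VERSION": "19.5.435", "PATH": "C:/existing"})
print(env["PATH"])
```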

It seems that I am still using the old version of the repo for that... (I forgot to copy the new GlobalJobPreLoad.py into the custom folder to replace the old one.)

Member

@moonyuet moonyuet left a comment


It works now, thanks to @BigRoy for guiding me on this... (It wouldn't work because I didn't update the GlobalJobPreLoad.py in the deadline folder.)


@moonyuet moonyuet merged commit 2c18900 into develop Sep 21, 2023
@ynbot ynbot added this to the next-patch milestone Sep 21, 2023
@moonyuet moonyuet deleted the enhancement/OP-6381_Update-deadline-GlobalJobPreLoad-to-update-job-env branch September 21, 2023 03:04

Labels

module: Deadline (AWS Deadline related features), size/XS (Denotes a PR that changes 0-99 lines, ignoring generated files), type: enhancement (Enhancements to existing functionality)

Projects

Archived in project


5 participants