
Conversation

@jchen351
Contributor

…-andriod-e2e-test-job.yml

Description

Motivation and Context

@jchen351 jchen351 changed the title from "comiit tools/ci_build/github/azure-pipelines/stages/jobs/react-natvie…" to "Update react-native to 0.72" Feb 3, 2025
# Conflicts:
#	js/react_native/e2e/android/gradle.properties
#	js/react_native/e2e/android/settings.gradle
#	js/react_native/e2e/ios/OnnxruntimeModuleExample.xcodeproj/project.pbxproj
#	js/react_native/e2e/ios/Podfile
#	js/react_native/e2e/ios/PrivacyInfo.xcprivacy
#	js/react_native/e2e/package.json
#	js/react_native/ios/OnnxruntimeModule.xcodeproj/project.pbxproj
#	js/react_native/ios/Podfile
#	js/react_native/ios/PrivacyInfo.xcprivacy
#	js/react_native/package.json
@jchen351 jchen351 marked this pull request as ready for review February 3, 2025 18:47
@jchen351 jchen351 requested review from edgchen1 and snnn February 3, 2025 21:25
Contributor

@snnn snnn left a comment


:shipit:

@jchen351 jchen351 merged commit b2560a7 into main Feb 4, 2025
160 checks passed
@jchen351 jchen351 deleted the Cjian/RN-72 branch February 4, 2025 17:53
sfatimar pushed a commit to intel/onnxruntime that referenced this pull request Feb 5, 2025
…-andriod-e2e-test-job.yml

sfatimar pushed a commit to intel/onnxruntime that referenced this pull request Feb 5, 2025
…-andriod-e2e-test-job.yml

ashrit-ms pushed a commit that referenced this pull request Feb 11, 2025
…-andriod-e2e-test-job.yml

ashrit-ms added a commit that referenced this pull request Feb 11, 2025
### Description
This PR updates the win-ort-main branch to the tip of the main branch as
of 2025-02-11.

### PR List
74c778e [WebNN EP] Automatically move input CPU tensors to ml-tensor
(#23073)
3775057 use correct total length to fix static kv_cache performance
(#23615)
3901e96 remove --use_vcpkg flag for Python-CUDA-Packaging-Pipeline
(#23631)
c610df5 Add python_requires to package metadata (#23604)
2d27d68 [QNN EP] Add QNN EP to ARM64X build targets (#23635)
e666503 [webgpu] no longer need pass-in gpu adapter for custom
context (#23593)
af679a0 Fix logic for selecting alternate name for blob (#23617)
e206950 [ARM CPU] Add fp16 mlas kernels for exp, tanh, softmax,
logsoftmax, softcap (#23597)
9ba5619 Update pybind and json to the latest (#23589)
c54736c Migrate iOS release pipeline to 1 ES (#23606)
3981326 Increase timeout for Windows TensorRT CI (#23625)
0274b7b fix on trtCudaVersion (#23616)
740e9ab update run CI script (#23621)
5ef1832 [WebGPU] Support PIX Capture for WebGPU EP (#23192)
0114551 Fix for C4267 warning (#23610)
002916a Validate the context_file_path before EP compile graphs
(#23611)
0887e36 [webgpu] Use pushErrorScope()/popErrorScope() once for an
inference run (#23438)
65008cb Auto-generated baselines by 1ES Pipeline Templates (#23603)
09e5724 [CUDA] Fix beam search of num_beams > 32 (#23599)
82840f6 Implement Flash Attention 2 for webgpu EP (#23576)
a6ea57b OpenVINO EP Weights Sharing Feature (#23553)
2c2ff4a [CUDA] Fix BeamSearchTest.DummyT5WithSequenceInputIds test
failure in Windows (#23596)
d981b15 [webgpu/js] Optimize resize webgpu op & fix precision issues
(#23591)
328a13c Enable VCPKG in more pipelines (#23590)
6728d60 [TensorRT EP] support TensorRT 10.8-GA (#23592)
d1fb58b Quantization tool: Allow user to override calibrator's
session EP (#23559)
649ced4 Enable user loading model with external data from memory
buffer (#23557)
544bdd6 Fix ConvTranspose for certain attribute combinations (#23488)
8f6ddf3 Delete extra cgmanifest entries and files (#23583)
5f6a315 Enable VCPKG in CI build (#23426)
e1e3f62 Bump lintrunner from 0.12.5 to 0.12.7 (#23326)
cd8775f Fix Node JS Samples (#23581)
6b4f9c4 [WebGPU EP] Batch Norm Implementation (#23525)
1fce51b Fix all instances of 4244 and 4267 warnings in OV EP code
(#23567)
c29ca1c Update QNN default version to 2.31 (#23573)
2fc75a4 [mobile] Add Android BrowserStack test project back (#23551)
9e18b6a [CUDA] Update nvcc flags (#23572)
b47e1e6 [QNN EP] Make offloading graph input/output quantization (to
CPU) the default (#23368)
75a9b40 [ROCm] Update CI to use rocm 6.3.2 (#23577)
26ff2b6 Bump ruff from 0.9.3 to 0.9.4 (#23563)
b2560a7 Update react-native to 0.72 (#23509)
faee912 [js] update JavaScript API to support QNN EP options (#23486)
816e8cb [EP Perf] Update env to ubuntu 22.04 (#23570)
cddc271 Use Eigen in Round implementation (#23571)
e8b0bdb Shape inference: ReduceMean dispatcher, quant_pre_process:
skip_symbolic_shape bugfix (#23558)
267b493 delete the supported domain version upper bounds (#23237)
bb7f961 remove log spam from cpuinfo (#23548)
169917b Use latest vcpkg commit in configuration, sync manifest with
deps.txt (#23554)
a9d4d08 Add of ReduceMax Gradient (#23501)
6bbf1bd [js/web] upgrade version of flatbuffers (#23545)
271c509 DP4AMatMul perf refinements (#23539)
cb69c59 Add fusions for SigLIP and Conformer-Encoder (#23528)
61fae9b Remove "--enable_pybind" from webgpu pipeline (#23550)
0bb4ea6 Update BiasGelu fusion and related ops (#23518)
4dde74a Add more details to BrowserStack script failure (#23520)
ead9d5c Set ANDROID_USE_LEGACY_TOOLCHAIN_FILE to false (#23544)
7e24088 Enable dlpack by default (#23110)
dc2f7a9 Add overload of `TryParseStringWithClassicLocale()` that uses
`std::from_chars()` (#23541)
5407c69 Fix the issue that the new generated EP context model not
able to find external data (#23537)
fbae88f [js/web] use the recommended workaround for Vite (#23531)
d5338da Fix tensor external data info length parsing issue. (#23526)
e3e4173 [ROCm EP] Fix transpose helper for gfx gridsize constraints
(#23527)
80bc1d2 Enable Ep context with external data for CPU nodes (#23498)
bf023ab [js/web] allow import .mjs/.wasm file (#23487)
655a23f [onnxruntime/build] Add new flag enable_generic_interface to
build primary EPs by default (#23342)
a770a8d Update RN to 0.71.19 (#23381)
1cf0ebd Delete Prefast workflow until the build failure is fixed
(#23510)
d2c5e24 Add of GlobalMaxPool Gradient (#23502)
ded8730 Remove thrust::unary_function (#23506)
8db97a6 [webgpu] Bump version of Dawn to b9b4a370 (#23494)
fdde2e2 Fix for gcc 13.3.1: Avoid creating a copy (#23500)
96ec1dd Bump ruff from 0.9.2 to 0.9.3 (#23496)
42f0c00 Adds the new System.Numerics.Tensors as an input/output type
when using dotnet 8.0 and up. (#23261)
97c2bbe Fix shape infer of onnx GroupNorm (#23477)
1fc9c48 Enable coremltools for Linux build (#23481)
13348c5 [ARM CPU] hgemm optimized for gqa (#23107)
c89a798 Enable opti on Microsoft.ML.OnnxRuntime with RelWithDebInfo
config (#23463)
d00ae32 Revert "[Mobile] Add BrowserStack Android MAUI Test (#23383)"
(#23474)
8b1d3b3 Align AvgPool ceil_mode on last value to torch (#16752)
06fc73b [TRT EP Perf Tool] Add annotations import to python script to
support annotations on Python 3.8 (#23466)

### Motivation and Context
This update includes the change to add QNN EP to ARM64X build targets.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Adrian Lizarraga <adlizarraga@microsoft.com>
Co-authored-by: Ti-Tai Wang <titaiwang@microsoft.com>
Co-authored-by: Caroline Zhu <wolfivyaura@gmail.com>
Co-authored-by: Grégoire <gregoire.verdier@gmail.com>
Co-authored-by: Jing Fang <126209182+fajin-corp@users.noreply.github.com>
Co-authored-by: Changming Sun <chasun@microsoft.com>
Co-authored-by: Yateng Hong <yatengh@microsoft.com>
Co-authored-by: Michael Sharp <51342856+michaelgsharp@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Malik Shahzad Muzaffar <shahzad.malik.muzaffar@cern.ch>
Co-authored-by: Yulong Wang <7679871+fs-eire@users.noreply.github.com>
Co-authored-by: Dmitri Smirnov <yuslepukhin@users.noreply.github.com>
Co-authored-by: Corentin Maravat <101636442+cocotdf@users.noreply.github.com>
Co-authored-by: Jian Chen <cjian@microsoft.com>
Co-authored-by: Karim Vadsariya <karim.vadsariya@microsoft.com>
Co-authored-by: Lei Cao <jslhcl@gmail.com>
Co-authored-by: Karim Vadsariya <kvadsariya@microsoft.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Hector Li <hecli@microsoft.com>
Co-authored-by: Ted Themistokleous <107195283+TedThemistokleous@users.noreply.github.com>
Co-authored-by: Ted Themistokleous <tedthemistokleous@amd.com>
Co-authored-by: Edward Chen <18449977+edgchen1@users.noreply.github.com>
Co-authored-by: Takeshi Watanabe <take-cheeze@users.noreply.github.com>
Co-authored-by: Xavier Dupré <xadupre@users.noreply.github.com>
Co-authored-by: Justin Chu <justinchuby@users.noreply.github.com>
Co-authored-by: Tianlei Wu <tlwu@microsoft.com>
Co-authored-by: kunal-vaishnavi <115581922+kunal-vaishnavi@users.noreply.github.com>
Co-authored-by: Sushanth Rajasankar <44513542+sushraja-msft@users.noreply.github.com>
Co-authored-by: PARK DongHa <luncliff@gmail.com>
Co-authored-by: George Wu <jywu@microsoft.com>
Co-authored-by: Xinpeng Dou <15529241576@163.com>
Co-authored-by: Jambay Kinley <jambaykinley@microsoft.com>
Co-authored-by: Yifan Li <109183385+yf711@users.noreply.github.com>
Co-authored-by: Gavin Kinsey <98115505+ms-gavinkinsey@users.noreply.github.com>
Co-authored-by: Prathik Rao <prathik.rao@gmail.com>
Co-authored-by: Jon Campbell <jcampbell@cephable.com>
Co-authored-by: Satya Kumar Jandhyala <satya.k.jandhyala@gmail.com>
Co-authored-by: Joshua Lochner <admin@xenova.com>
Co-authored-by: Ankit Maheshkar <ankit.maheshkar@intel.com>
Co-authored-by: jatinwadhwa921 <jatin.wadhwa@intel.com>
Co-authored-by: jatinwadhwa921 <110383850+jatinwadhwa921@users.noreply.github.com>
Co-authored-by: saurabh <saurabh1.kale@intel.com>
Co-authored-by: TejalKhade28 <tejal.khade@intel.com>
Co-authored-by: sfatimar <sahar.fatima@intel.com>
Co-authored-by: Javier E. Martinez <javier.e.martinez@intel.com>
Co-authored-by: Preetha Veeramalai <preetha.veeramalai@intel.com>
Co-authored-by: Eric Crawford <eric.r.crawford@intel.com>
Co-authored-by: microsoft-github-policy-service[bot] <77245923+microsoft-github-policy-service[bot]@users.noreply.github.com>
Co-authored-by: Jie Chen <jie.a.chen@intel.com>
Co-authored-by: shaoboyan091 <shaoboyan@microsoft.com>
Co-authored-by: David Hotham <david.hotham@microsoft.com>
Co-authored-by: Guenther Schmuelling <guschmue@microsoft.com>
Co-authored-by: Enrico Galli <enrico.galli@intel.com>
guschmue pushed a commit that referenced this pull request Mar 6, 2025
…-andriod-e2e-test-job.yml

ashrit-ms pushed a commit that referenced this pull request Mar 17, 2025
…-andriod-e2e-test-job.yml

snnn added a commit that referenced this pull request Nov 4, 2025
On case-sensitive filesystems (Windows with WSL, developer mode, or
per-directory case sensitivity), DllImport fails to load the native
libraries because it relies on .NET's automatic platform-specific
extension/prefix addition, which can produce incorrect casing.
## Changes

- **NativeMethods.shared.cs**: Changed desktop platform library names
from `"onnxruntime"` to `"onnxruntime.dll"` and `"ortextensions"` to
`"ortextensions.dll"`
- Explicitly specifying extensions ensures consistent behavior across
case-sensitive and case-insensitive filesystems
- Android/iOS platform-specific names unchanged

## Impact

**Windows**: No changes required; the libraries are already named
`onnxruntime.dll`.

**Linux/macOS**: Native packaging may need updates to provide
`onnxruntime.dll` in runtime folders (either as the actual filename or as a
symlink to `libonnxruntime.so`/`libonnxruntime.dylib`).
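
The Linux/macOS note above is purely a packaging concern. As a point of reference only (not something this PR adds), an application on .NET Core 3.0+ could also bridge the name difference itself with `NativeLibrary.SetDllImportResolver`. The following is a hypothetical sketch: the class and method names are invented, and it assumes the Microsoft.ML.OnnxRuntime package is referenced.

```csharp
using System;
using System.Reflection;
using System.Runtime.InteropServices;

// Hypothetical application-side shim (not part of this PR): if a package only
// ships libonnxruntime.so / libonnxruntime.dylib, the consuming app can map the
// explicit "onnxruntime.dll" name back to the platform file at load time.
// Only one resolver may be registered per assembly, so this call throws if the
// library itself ever registers its own resolver.
internal static class OrtLibraryNameShim
{
    public static void Register()
    {
        NativeLibrary.SetDllImportResolver(
            typeof(Microsoft.ML.OnnxRuntime.InferenceSession).Assembly,
            ResolveOrtLibrary);
    }

    private static IntPtr ResolveOrtLibrary(
        string libraryName, Assembly assembly, DllImportSearchPath? searchPath)
    {
        if (libraryName == "onnxruntime.dll" && !OperatingSystem.IsWindows())
        {
            string unixName = OperatingSystem.IsMacOS()
                ? "libonnxruntime.dylib"
                : "libonnxruntime.so";
            return NativeLibrary.Load(unixName, assembly, searchPath);
        }
        return IntPtr.Zero; // IntPtr.Zero defers to the default resolution logic
    }
}
```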

```csharp
// Before (relied on automatic extension addition)
internal const string DllName = "onnxruntime";

// After (explicit extension for consistency)
internal const string DllName = "onnxruntime.dll";
```
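
For context, here is a minimal sketch of how such a constant feeds the P/Invoke declarations; the class layout and the `IntPtr` return type are simplifications for illustration, not the actual contents of NativeMethods.shared.cs.

```csharp
using System;
using System.Runtime.InteropServices;

// Simplified sketch: every DllImport goes through the DllName constant, so
// spelling out the ".dll" extension keeps the probed file name predictable on
// case-sensitive filesystems instead of depending on the loader's automatic
// prefix/suffix probing.
internal static class NativeMethodsSketch
{
    internal const string DllName = "onnxruntime.dll";

    // The real entry point returns a pointer to the OrtApiBase struct; IntPtr
    // keeps this sketch self-contained.
    [DllImport(DllName, CharSet = CharSet.Ansi)]
    internal static extern IntPtr OrtGetApiBase();
}
```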

Fixes #23509

<!-- START COPILOT CODING AGENT SUFFIX -->



<details>

<summary>Original prompt</summary>

> 
> ----
> 
> *This section details on the original issue you should resolve*
> 
> <issue_title>Does not work on case-sensitive filesystems</issue_title>
> <issue_description>### Describe the issue
> 
> Library does not work on case-sensitive filesystems. We get:
> 
> ```
> Unhandled exception. System.TypeInitializationException: The type
initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an
exception.
> ---> System.EntryPointNotFoundException: Unable to find an entry point
named 'OrtGetApiBase' in DLL 'onnxruntime'.
>    at Microsoft.ML.OnnxRuntime.NativeMethods.OrtGetApiBase()
>    at Microsoft.ML.OnnxRuntime.NativeMethods..cctor()
>    --- End of inner exception stack trace ---
>    at Microsoft.ML.OnnxRuntime.SessionOptions..ctor()
> at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath)
> at Program.<Main>$(String[] args) in Z:\temp\onnxtest\Program.cs:line
1
> ```
> 
> Probably due to a mistyped filename somewhere.
> 
> ### To reproduce
> 
> Create new C# project, use this Program.cs:
> 
> ```
> new Microsoft.ML.OnnxRuntime.InferenceSession("");
> ```
> 
> 
> ### Urgency
> 
> _No response_
> 
> ### Platform
> 
> Windows
> 
> ### OS Version
> 
> Microsoft Windows [Version 10.0.19045.6332]
> 
> ### ONNX Runtime Installation
> 
> Released Package
> 
> ### ONNX Runtime Version or Commit ID
> 
> 1.22.1
> 
> ### ONNX Runtime API
> 
> Other / Unknown
> 
> ### Architecture
> 
> X64
> 
> ### Execution Provider
> 
> Other / Unknown
> 
> ### Execution Provider Library Version
> 
> _No response_</issue_description>
> 
> ## Comments on the Issue (you are @copilot in this section)
> 
> <comments>
> <comment_new><author>@snnn</author><body>
> I checked the code, still do not have much clue. We didn't specify the
extension name. Maybe it was added by the .net runtime.
> 
>
https://github.com/microsoft/onnxruntime/blob/main/csharp/src/Microsoft.ML.OnnxRuntime/NativeMethods.shared.cs#L831</body></comment_new>
> <comment_new><author>@snnn</author><body>
> > Maybe we should just specify explicit extensions then in DllImport
> 
> That might be the easiest fix. Would like to submit a
PR?</body></comment_new>
> <comment_new><author>@snnn</author><body>
> Don't close it.</body></comment_new>
> </comments>
> 


</details>

- Fixes #26129

<!-- START COPILOT CODING AGENT TIPS -->
---

💬 We'd love your input! Share your thoughts on Copilot coding agent in
our [2 minute survey](https://gh.io/copilot-coding-agent-survey).

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: snnn <856316+snnn@users.noreply.github.com>
Co-authored-by: Changming Sun <chasun@microsoft.com>
Rohanjames1997 pushed a commit to Rohanjames1997/onnxruntime that referenced this pull request Dec 4, 2025
…#26415)


Labels: None yet
Projects: None yet
4 participants