Conversation

@jokasimr
Contributor

@jokasimr jokasimr commented Jun 24, 2025

Fixes #180 #181

Initial data reduction workflow for BEER McStas data files.

I've tried to keep it pretty simple and flexible for now.
I'm not sure where this should be "connected" to the regular diffraction workflow, so for now it's a separate workflow.

This is a draft for now because the McStas data loader needs to load chopper information more completely before it can properly handle the different chopper "modes".

Comments and suggestions are welcome.
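
For context, a minimal usage sketch based on the demo notebook in this PR (package imports are omitted because the module path is not shown in the diff, and the entry points may still change while this is a draft):

```python
# Run the BEER McStas reduction on one of the simulated duplex-sample files.
wf = BeerMcStasWorkflow()
wf[Filename[SampleRun]] = mcstas_duplex(9)
da = wf.compute(DetectorTofData[SampleRun])
```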

def compute_tof_from_known_peaks(
    da: DetectorData[RunType], graph: ElasticCoordTransformGraph
) -> DetectorTofData[RunType]:
    return da.transform_coords(
Contributor Author

This is badly named, since it computes dspacing etc in addition to tof. That's just how it works right now to get the demo notebook running without too much effort. I would like to connect to the regular powder diffraction workflow after this point.

'tof': _tof_from_dhkl,
'coarse_dhkl': _compute_d,
'theta': lambda two_theta: two_theta / 2,
'dhkl_list': lambda: dhkl_list,
Contributor Author
@jokasimr jokasimr Jul 8, 2025

@jl-wynen in chat you mentioned it's better to generate a new _compute_d function that binds dhkl_list in its scope, instead of having dhkl_list as an "intermediate" here and then using keep_intermediate=False. The plan is to do that eventually.
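
For reference, a minimal sketch of that closure approach. The helper name _make_compute_d is hypothetical, and it assumes the current _compute_d accepts coarse_d and dhkl_list as arguments, which may not match the exact signature in this PR:

```python
def _make_compute_d(dhkl_list):
    """Return a 'coarse_dhkl' provider with dhkl_list bound in its scope."""

    def compute_d(coarse_d):
        # dhkl_list is captured from the enclosing scope instead of being
        # provided to transform_coords as an intermediate coordinate.
        return _compute_d(coarse_d=coarse_d, dhkl_list=dhkl_list)

    return compute_d


graph = {
    'tof': _tof_from_dhkl,
    'coarse_dhkl': _make_compute_d(dhkl_list),
    'theta': lambda two_theta: two_theta / 2,
    # No 'dhkl_list' entry, so there is nothing left to drop with
    # keep_intermediate=False afterwards.
}
```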

Member

Can you do this now or open an issue to track it?

Contributor Author

The notebook in its current shape is mainly there to display the current status of the implementation to Celine. When we know more, it will be replaced by a more useful tutorial notebook.

Comment on lines 10 to 17
da.coords['coarse_d'] = (
    constants.h
    / constants.m_n
    * (da.coords['event_time_offset'] - da.coords['approximate_t0'])
    / sc.sin(da.coords['two_theta'] / 2)
    / da.coords['L0']
    / 2
).to(unit='angstrom')
Member

Bit low-level, but since I noticed this: Can you use https://github.com/scipp/scippneutron/blob/5acb18bc675eba373fae94ea73012d599e1e1f49/src/scippneutron/conversion/tof.py#L61 instead of reimplementing it here?

Contributor Author

Yes we can
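
A sketch of what that could look like, assuming scippneutron.conversion.tof.dspacing_from_tof keeps its keyword-only tof, Ltotal, two_theta signature (L0 is passed in the role of the total flight path, mirroring the code above):

```python
from scippneutron.conversion.tof import dspacing_from_tof

# Reuse the scippneutron kernel instead of spelling out the Bragg relation.
da.coords['coarse_d'] = dspacing_from_tof(
    tof=da.coords['event_time_offset'] - da.coords['approximate_t0'],
    Ltotal=da.coords['L0'],
    two_theta=da.coords['two_theta'],
).to(unit='angstrom')
```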

0.5107,
0.5107,
0.5102,
0.5057,
Member

Where do those numbers come from, and what sample do they apply to? And why are they hardcoded in the package?

Contributor Author

They are peak positions for a particular test sample. They should not be hardcoded in the package.
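
One possible way to take them out of the package, sketched with a hypothetical KnownPeakDSpacings parameter type (the type name, dimension label, and unit are assumptions, not something defined in this PR):

```python
from typing import NewType

import scipp as sc

# Hypothetical workflow parameter for the reference peaks of the sample.
KnownPeakDSpacings = NewType('KnownPeakDSpacings', sc.Variable)

wf = BeerMcStasWorkflowKnownPeaks()
wf[Filename[SampleRun]] = mcstas_duplex(9)
# The peak list for the duplex test sample is supplied by the user
# instead of being hardcoded in the package.
wf[KnownPeakDSpacings] = sc.array(
    dims=['dhkl'], values=[0.5107, 0.5107, 0.5102, 0.5057], unit='angstrom'
)
```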

"wf = BeerMcStasWorkflowKnownPeaks()\n",
"wf[Filename[SampleRun]] = mcstas_duplex(9)\n",
"da = wf.compute(DetectorTofData[SampleRun])\n",
"da.hist(dspacing=dspacing).plot()"
Member

Why does this data have a dspacing coord? It should be in tof, not dspacing.

Contributor Author

Yes I agree, it should just have tof.

"wf = BeerMcStasWorkflow()\n",
"wf[Filename[SampleRun]] = mcstas_duplex(9)\n",
"da = wf.compute(DetectorTofData[SampleRun])\n",
"da.bins.coords['dspacing'] = (sc.constants.h / sc.constants.m_n * da.bins.coords['tof'] / sc.sin(da.bins.coords['two_theta'] / 2) / da.bins.coords['Ltotal'] / 2).to(unit='angstrom')\n",
Member

This one doesn't have dspacing? Then it should be a different type.
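
As an aside on the cell above, the hand-written Bragg formula could also be replaced by a coordinate transform, assuming the standard scippneutron elastic graph applies and the result keeps its Ltotal and two_theta coordinates:

```python
import scippneutron as scn

# Derive dspacing from the tof result instead of writing out the formula,
# then histogram with the dspacing bin edges used in the earlier cell.
da = da.transform_coords(
    'dspacing', graph=scn.conversion.graph.tof.elastic_dspacing('tof')
)
da.hist(dspacing=dspacing).plot()
```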

@jokasimr jokasimr requested a review from jl-wynen August 21, 2025 15:31
powgen
```

## ESSbeer
Member

We actually don't use the 'ESS' prefix in these headings in other packages. Can you change this to 'Beer' and also above to 'Dream'?

And please move this above the SNS section. Beer is more important.

max_distance_from_streak_line = mod_period / 3
sin_theta_L = sc.sin(da.bins.coords['two_theta'] / 2) * da.bins.coords['Ltotal']
t = da.bins.coords['event_time_offset']
for _ in range(15):
Member

Please make this a parameter.
Is there a way to check for convergence instead of blindly running 15 iterations? If we don't have any checks, how can we know whether the data is meaningful?

Contributor Author

I think it's an open question what the convergence check would be here.
I agree it would be good to have, but for now I think we should create an issue to remember it and leave it as follow-up work.

Contributor Author

Making it a parameter is not a good idea. That gives the impression this is something the user should fiddle with; it's unlikely to help them and will probably just cause confusion and extra work for us.

If 15 is insufficient we should just increase it, but I haven't seen any indication that this is the case.
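
If a convergence check is added later, one possible shape is sketched below. refine_streak_assignment is a hypothetical stand-in for the body of the loop above, and the exact comparison would depend on what the loop actually updates:

```python
import warnings

import scipp as sc

max_iterations = 15
assignment = None
for _ in range(max_iterations):
    # Hypothetical refinement step standing in for the real loop body.
    new_assignment = refine_streak_assignment(t, sin_theta_L, previous=assignment)
    if assignment is not None and sc.identical(new_assignment, assignment):
        # No event changed streak line, so further passes are a no-op.
        break
    assignment = new_assignment
else:
    warnings.warn('Streak assignment did not converge.', stacklevel=2)
```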

'tof': _tof_from_dhkl,
'coarse_dhkl': _compute_d,
'theta': lambda two_theta: two_theta / 2,
'dhkl_list': lambda: dhkl_list,
Member

Can you do this now or open an issue to track it?

jokasimr and others added 4 commits August 22, 2025 13:40
Co-authored-by: Jan-Lukas Wynen <jan-lukas.wynen@ess.eu>
Co-authored-by: Jan-Lukas Wynen <jan-lukas.wynen@ess.eu>
Co-authored-by: Jan-Lukas Wynen <jan-lukas.wynen@ess.eu>
@jokasimr
Contributor Author

> Can you do this now or open an issue to track it?

What do you mean? :) Do what now?

@jl-wynen
Member

> Can you do this now or open an issue to track it?
>
> What do you mean? :) Do what now?

That was a reply to #184 (comment). Don't know why it showed up as a separate comment as well.

@jokasimr
Contributor Author

> That was a reply to #184 (comment). Don't know why it showed up as a separate comment as well.

Aha, I see. It's done now anyway

@jokasimr jokasimr enabled auto-merge (squash) September 4, 2025 07:40
@jokasimr jokasimr merged commit cdea781 into main Sep 4, 2025
4 checks passed
@jokasimr jokasimr deleted the beer branch September 4, 2025 07:48


Development

Successfully merging this pull request may close these issues.

[Requirement] Implement multiplicated reflection collapsing (BEER)

3 participants