
Conversation

@isurulucky
Collaborator

@isurulucky isurulucky commented Jan 23, 2025

A simple conditional sampling implementation where the input to be conditioned on and the scopes to be marginalized can be specified.
@loreloc
Member

loreloc commented Feb 11, 2025

Hi Isuru,
sorry for the late reply. Thanks a lot for submitting this PR!
However, I currently do not have much time to spend on cirkit. I might be able to have some time to look at your PR starting from the next week.

Thanks again!

@isurulucky
Collaborator Author


Hi @loreloc ,

No problem at all, please take your time. Thank you.

BR,
Isuru

@loreloc
Member

loreloc commented Mar 3, 2025

Hi Isuru,
thanks again for working on this!

I had a look at your PR and I think there is room for improvement.

I think we should not cache, within the torch layer objects, tensors that are used only by the queries. Otherwise it would make de-allocating such tensors much more difficult, and it would also pollute the layer's attribute namespace with something that might not be needed by the user or by other queries. In general, we follow the design that layers do not know the layers they are connected to, and should not store the results of circuit computations themselves.

Therefore, I think the conditional sampling implementation could store the intermediate outputs computed by each layer (i.e., the input to each subsequent layer) within the sampling query object. These tensors, kept in a single place, would then be freed upon the computation of the samples.

In practice, this could be done by allowing the evaluate method in the torch computational graph to return the outputs of each layer. Then, the conditional sampling routine could pass these outputs to the sampling method of each layer.
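The design above could be sketched in plain Python roughly as follows. This is only an illustration of the ownership pattern, not cirkit's actual API: `ToyLayer`, `evaluate_with_outputs`, and the `graph` encoding are all hypothetical names.

```python
# Hypothetical sketch: the query (here, a plain function) owns the cache of
# intermediate outputs; the layers themselves stay stateless.

class ToyLayer:
    """A stateless layer: it never stores circuit outputs itself."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def forward(self, *inputs):
        return self.fn(*inputs)

def evaluate_with_outputs(layers, graph, x):
    """Evaluate layers in topological order, returning every layer's output.

    `graph` maps a layer name to the names of the layers feeding into it;
    layers with no predecessors read the circuit input `x` directly.
    """
    outputs = {}
    for layer in layers:
        ins = [outputs[p] for p in graph.get(layer.name, [])] or [x]
        outputs[layer.name] = layer.forward(*ins)
    return outputs

# Tiny example circuit: input -> double -> inc
layers = [ToyLayer("double", lambda v: 2 * v),
          ToyLayer("inc", lambda v: v + 1)]
graph = {"inc": ["double"]}

cache = evaluate_with_outputs(layers, graph, 3)
# A sampling routine would read each layer's input from `cache`; since the
# cache lives in the query, not in the layers, it is freed once the query
# finishes and lets go of it.
print(cache)  # {'double': 6, 'inc': 7}
```

The key point is that the cache is a local object owned by the query, so deallocation happens naturally when the query completes, and the layers' attribute namespaces stay untouched.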

I hope this helps a bit. Let me know if you have any other questions.

@isurulucky
Collaborator Author


Hi @loreloc ,

Many thanks for the feedback! Give me some time to go through it in detail, and I will update the PR accordingly. I will get back to you if I have any questions.

BR,
Isuru

@loreloc loreloc added this to the cirkit 0.3.0 (reckoning) milestone Mar 6, 2025
@isurulucky isurulucky marked this pull request as draft March 12, 2025 20:51
@isurulucky isurulucky marked this pull request as ready for review March 13, 2025 09:06
@isurulucky
Collaborator Author

Hi @loreloc ,

I incorporated the suggestions into the changes. I ended up creating a new class for conditional sampling, as it seemed cleaner from the code perspective. However, this required exposing some functionality of the IntegrateQuery class for use in conditional sampling. Let me know what you think.

Thanks and regards,
Isuru

@loreloc
Member

loreloc commented Apr 28, 2025

Hi Isuru, sorry for the delay on this!

Could you please add a unit test for the conditional sampling?

For instance, you could test it by comparing empirical frequencies of sampled values with probabilities computed by the circuit itself. E.g., see how the unconditional sampling is tested here: https://github.com/april-tools/cirkit/blob/main/tests/backend/torch/test_queries/test_sampling.py#L16-L51.
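The suggested test strategy could look roughly like this. A toy categorical distribution stands in for a real cirkit circuit, and the probabilities, sample count, and tolerance are all illustrative choices, not values from the linked test file.

```python
# Illustrative sketch of the frequency-vs-probability test: draw many samples
# from a known distribution, then check that empirical frequencies match the
# probabilities the model assigns, up to sampling noise.
import random

random.seed(0)
probs = {0: 0.2, 1: 0.5, 2: 0.3}  # toy stand-in for circuit probabilities

num_samples = 20000
samples = random.choices(list(probs), weights=list(probs.values()), k=num_samples)

for value, p in probs.items():
    freq = samples.count(value) / num_samples
    # With 20k samples the standard error is ~0.004, so 0.02 is a loose bound.
    assert abs(freq - p) < 0.02, (value, freq, p)
```

In the real test, conditioning on some variables and comparing against the circuit's renormalized conditional probabilities would exercise the conditional sampling path specifically.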

@isurulucky
Collaborator Author


Hi @loreloc,

Apologies for the delay from my end as well. I will add some test cases and update here.

At the moment, conditional sampling works only for optimized circuits.
@isurulucky
Copy link
Collaborator Author

isurulucky commented May 27, 2025

Hi @loreloc ,

I added a unit test. While writing the test case, I noticed that conditional sampling fails for non-optimized circuits due to a dimension mismatch. If you think the current PR is good enough with this limitation, we can merge it. I will try to have a look at this issue in the coming days.

Thank you.
