Support for conditional sampling #344
Conversation
A simple conditional sampling implementation where the input to be conditioned on and the scopes to be marginalized can be specified.

Hi Isuru, Thanks again!

Hi @loreloc , No problem at all, please take your time. Thank you. BR,
Hi Isuru, I had a look at your PR and I think there is room for improvement. I think we should not cache tensors within the torch layer objects that are used only by the queries. Otherwise it would make de-allocating such tensors much more difficult, and it would also pollute the layer's attribute namespace with something that might not be needed by the user or by other queries. In general, we follow the design that layers do not know the layers they are connected to, and should not store the results of circuit computations themselves. Therefore, I think the conditional sampling implementation could be done by storing the intermediate outputs computed by each layer (i.e., the input to each subsequent layer) within the sampling query object. These tensors, stored in a single place, would then be freed upon the computation of the samples. In practice, this could be done by allowing the evaluate method in the torch computational graph to return the outputs of each layer. Then, the conditional sampling routine could pass these outputs to the sampling method of each layer. I hope this helps a bit. Let me know if you have any other questions.
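The design described above can be sketched on a toy computational graph. This is only an illustration of the idea (layers stay stateless with respect to query results; the query object owns and frees the per-layer outputs); the class and method names here are hypothetical and not cirkit's actual API:

```python
# Toy sketch of the proposed design: evaluate() returns the output of
# every layer, and the query object (not the layers) caches them.
# All names here are illustrative, not cirkit's real API.

class Layer:
    def __init__(self, weight):
        self.weight = weight

    def forward(self, x):
        # Layers compute outputs but never cache results on themselves.
        return [v * self.weight for v in x]


class Graph:
    def __init__(self, layers):
        self.layers = layers

    def evaluate(self, x):
        # Return the output of *every* layer, not just the last one,
        # so that queries can reuse the intermediate results.
        outputs = []
        for layer in self.layers:
            x = layer.forward(x)
            outputs.append(x)
        return outputs


class ConditionalSamplingQuery:
    def __init__(self, graph):
        self.graph = graph
        self._layer_outputs = None  # cached here, in a single place

    def __call__(self, x):
        self._layer_outputs = self.graph.evaluate(x)
        # A real implementation would pass these per-layer outputs to
        # each layer's sampling method; here we just take the last one.
        result = self._layer_outputs[-1]
        self._layer_outputs = None  # freed once sampling is done
        return result
```

For example, `ConditionalSamplingQuery(Graph([Layer(2.0), Layer(3.0)]))([1.0])` evaluates both layers, keeps the intermediate outputs only for the duration of the query, and returns `[6.0]`.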
Hi @loreloc , Many thanks for the feedback! Give me some time to go through it in detail, and I will update the PR accordingly. I will get back to you if I have questions. BR,
Hi @loreloc , I incorporated the suggestions into the changes. I ended up creating a new class for conditional sampling, as it seemed cleaner from the code perspective. However, I also ended up exposing some functionality of the IntegrateQuery class so it can be used by the conditional sampling. Let me know what you think. Thanks and regards,
Hi Isuru, sorry for the delay on this! Could you please add a unit test for the conditional sampling? For instance, you could test it by comparing empirical frequencies of sampled values with probabilities computed by the circuit itself. E.g., see how the unconditional sampling is tested here: https://github.com/april-tools/cirkit/blob/main/tests/backend/torch/test_queries/test_sampling.py#L16-L51.
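The testing strategy suggested above can be sketched on a toy discrete distribution. This is not cirkit code; in the real test the joint and conditional probabilities would come from the circuit itself, and the sampler would be the conditional sampling query under test:

```python
import random
from collections import Counter

# Toy joint distribution p(x, y) over two binary variables; a stand-in
# for probabilities that the circuit would compute in the real test.
JOINT = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def conditional(joint, x_value):
    # Exact p(y | x = x_value), obtained by renormalising the joint.
    z = sum(p for (x, y), p in joint.items() if x == x_value)
    return {y: p / z for (x, y), p in joint.items() if x == x_value}

def sample_conditional(joint, x_value, n, rng):
    # Stand-in for the conditional sampling routine under test.
    cond = conditional(joint, x_value)
    values, probs = zip(*sorted(cond.items()))
    return rng.choices(values, weights=probs, k=n)

rng = random.Random(0)
n = 50_000
samples = sample_conditional(JOINT, 1, n, rng)
freqs = {y: c / n for y, c in Counter(samples).items()}
exact = conditional(JOINT, 1)
# The test asserts that empirical frequencies are close to the exact
# conditional probabilities, up to a small Monte Carlo tolerance.
assert all(abs(freqs[y] - exact[y]) < 0.02 for y in exact)
```

With 50,000 samples the standard error of each frequency is roughly 0.002, so a tolerance of 0.02 makes the check both tight and reliable.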
Hi @loreloc, Apologies for the delay from my end as well. Will add some test cases and update here.
At the moment, conditional sampling works only for optimized circuits.
Hi @loreloc , I added a unit test. But while writing the test case, I noticed that conditional sampling fails for non-optimized circuits due to a dimension mismatch. If you think the current PR is good enough with this limitation, we can merge it. I will try to have a look at this issue in the coming days. Thank you.
Conditional sampling implementation where the input to be conditioned on and the scopes to be marginalized can be specified.