Sketch out conj method #46
base: dev
Conversation
if shared_indices.intersection(set(pair_node.get_all_dangling())):
    edge = tn.connect(node, pair_node)

# TODO: TNAdapter should support tensornetwork.Node
@danlkv this is my TODO for next week
scratchpad/tn_api/tn.py
Outdated
class Array(np.ndarray):
    shape: tuple[int]
Removed all instances of tuple[... because I was getting an error that types can't be subscripted with [.
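For reference, the traceback later in this thread shows Python 3.8, where subscripting the builtin tuple in a class-body annotation raises TypeError: 'type' object is not subscriptable. A minimal sketch of the usual workaround (the class body here is illustrative, not the repo's actual one):

from typing import Tuple
import numpy as np

class Array(np.ndarray):
    # typing.Tuple works on 3.8; the builtin tuple[int, ...] form needs
    # Python 3.9+ (or `from __future__ import annotations` at module top).
    shape: Tuple[int, ...]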
scratchpad/tn_api/tn.py
Outdated
# slice not inplace
def slice(self, slice_dict: dict) -> 'TensorNetwork':
    ...
    tn = self.copy()
    sliced_buckets = np_framework.get_sliced_np_buckets(self.buckets, self.data_dict, slice_dict)
I imagine data_dict needs to be populated for this to be meaningful logic. Is that something the callers should pass in? @danlkv
The tensor class shouldn't contain buckets; that would duplicate information, since we already have the tensors property.
The slice method should take the dict and, for every index, find the tensors that have that index. Then slice those tensors using the values in the dict, and replace the original ones.
Note also that the tensors are numpy arrays in this class.
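A minimal sketch of that description, under the assumption (hypothetical here) that _indices maps an index name to the (tensor position, dimension) pairs that carry it:

# Sketch only: assumes `self._tensors` is a list of numpy arrays and
# `self._indices` maps index -> [(tensor_ref, dim), ...].
def slice(self, slice_dict: dict) -> 'TensorNetwork':
    tn = self.copy()
    for index, value in slice_dict.items():
        for tensor_ref, dim in tn._indices[index]:
            t = tn._tensors[tensor_ref]
            slicer = [slice(None)] * t.ndim
            slicer[dim] = value  # fix this dimension to `value`
            # Note: integer indexing drops the dimension, so a full
            # version would remap the remaining (tensor_ref, dim) pairs.
            tn._tensors[tensor_ref] = t[tuple(slicer)]
    return tn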
scratchpad/tn_api/tn.py
Outdated
# slice not inplace
def slice(self, slice_dict: dict) -> 'TensorNetwork':
    tn = self.copy()
    sliced_buckets = np_framework.get_sliced_np_buckets(self.buckets, self.data_dict, slice_dict)
    tn.buckets = sliced_buckets
    sliced_tns = []
@danlkv I think this implementation is better, wdyt?
Yes, this is much better. However, we need a way to refer to tensor indices globally. For example, what would happen if we used this code to slice a TN with 3 tensors of shape (2, 2) and the slice dict {1: 0}? The answer is that every tensor would be sliced along its 2nd dimension, but we want behavior that is global to the whole TN.
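To make that concrete, a small numpy illustration (names hypothetical): under the per-tensor reading, {1: 0} fixes dimension 1 of every tensor rather than one shared index of the network:

import numpy as np

tensors = [np.random.rand(2, 2) for _ in range(3)]
# Per-tensor interpretation of slice_dict {1: 0}:
sliced = [t[:, 0] for t in tensors]  # every tensor loses its 2nd dimension
print([t.shape for t in sliced])     # [(2,), (2,), (2,)]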
You also have to consider how the edges change.
@dallonasnes Here's a sketch of what could be a solution.
First, let me clear up something about the edges structure. Edges in the edge array represent delta-tensors, which make all indices connected to them the same. For example, for the expression T_{abc}\delta_{abd}B_{edg}C_{mp}, the equivalent expression is T_{xxc}B_{exg}C_{mp}. Note that there is a one-to-one correspondence between each delta and an index (in the example, the index x).
Let's assume that every index in our TN is represented by a delta tensor and a corresponding tuple in the edges attribute (this may seem excessive, but we can overcome the overhead later). A slice dict can then refer to an element of the edges attribute by just the index of the edge, so the slice dict is {index in edge dict: index value}.
The algorithm would be:
1. For each (index, value) pair of the slice dict:
   - Pop the edge from the edges attr using the index.
   - Get all tensors that are indexed by this edge (in the example above, T and B). Use Port.tensor_ix for that.
   - Using the information in the ports of the edge, construct the local (w.r.t. each tensor) slicing dict for each tensor.
   - Slice and update the tensor.
Note that after slicing, the edge disappears.
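A minimal sketch of that algorithm, assuming the structures printed later in this thread (Port(tensor_ref=..., ix=...) and edges as tuples of Ports); the function name and standalone form are hypothetical, and a full implementation would also remap the ix values of the remaining ports once a tensor loses dimensions:

from collections import defaultdict
from dataclasses import dataclass
import numpy as np

@dataclass(frozen=True)
class Port:
    tensor_ref: int  # position of the tensor in the TN's tensor list
    ix: int          # dimension of that tensor attached to the edge

def slice_edges(tensors, edges, slice_dict):
    # slice_dict maps an edge index to the value that index is fixed to.
    tensors, edges = list(tensors), list(edges)
    # Pop larger edge indices first so the remaining keys stay valid.
    for edge_ix in sorted(slice_dict, reverse=True):
        value = slice_dict[edge_ix]
        ports = edges.pop(edge_ix)  # the edge disappears after slicing
        # One local slicing dict per tensor, so a tensor that touches the
        # edge twice (e.g. T_{xxc}) is sliced in a single indexing op.
        local = defaultdict(list)
        for port in ports:
            local[port.tensor_ref].append(port.ix)
        for tensor_ref, dims in local.items():
            t = tensors[tensor_ref]
            slicer = [slice(None)] * t.ndim
            for d in dims:
                slicer[d] = value  # fix this dimension to `value`
            tensors[tensor_ref] = t[tuple(slicer)]
    return tensors, edges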
self._indices = {}

# slice not inplace
def slice(self, slice_dict: dict) -> 'TensorNetwork':
@danlkv thanks for the previous comment, I believe I've addressed that in this update.
scratchpad/tn_api/tn.py
Outdated
# contract to produce a new tensor
def contract(self, contraction_info: ContractionInfo) -> np.ndarray:
    raise NotImplementedError()
contraction_info here is the indices I'd want in the output of the einsum.
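For reference, the plain numpy analogue of that idea: the subscripts after '->' are exactly the indices kept in the output, and everything else is summed over:

import numpy as np

A = np.random.rand(3, 4)
B = np.random.rand(4, 5)
out = np.einsum('ij,jk->ik', A, B)  # keep i and k in the output, contract j
print(out.shape)                    # (3, 5)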
The test case: the contracted A should equal the sum of the contractions of sliceA[0] and sliceA[1]. This is true for any index, so it can be selected at random.
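A small numpy version of that check (names hypothetical): slicing a shared index and summing the contracted slices should reproduce the full contraction:

import numpy as np

A = np.random.rand(2, 3)
B = np.random.rand(3, 4)
full = np.einsum('ab,bc->ac', A, B)
# Slice the shared index b and sum the contracted slices:
sliced = sum(np.einsum('a,c->ac', A[:, k], B[k, :]) for k in range(3))
assert np.allclose(full, sliced)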
sliced_tensor = tn._tensors[current_tensor_ref][tuple(slice_bounds)]
tn._tensors[current_tensor_ref] = sliced_tensor

# TODO: this is just a guess - but I am adding the ports from the popped edge back to the list of slices
@danlkv where we left off last time is that calling einsum fails on a sliced TN. It's likely because slicing the TN removes all edges; I don't think that behavior is intended, so my proposed fix is that in this section of logic (starting at line 76), when we pop an edge out, at some point we need to add it back.
I've added some pdb loops so that when we error out, we can reproduce it in the same run.
Here's an example error output:
aa,ba->ab
[(4, 4, 2), (4, 4, 2)]
((Port(tensor_ref=0, ix=0), Port(tensor_ref=0, ix=1), Port(tensor_ref=1, ix=1)), (Port(tensor_ref=1, ix=0),))
2
operand has more dimensions than subscripts given in einstein sum, but no '...' ellipsis provided to broadcast the extra dimensions.
[1] > /app/scratchpad/tn_api/tn.py(172)contract()
-> keep_going = True
(Pdb++) c
Traceback (most recent call last):
File "tn_api/tn.py", line 168, in contract
return np.einsum(einsum_expr, *self._tensors)
File "<__array_function__ internals>", line 200, in einsum
File "/usr/local/lib/python3.8/dist-packages/numpy/core/einsumfunc.py", line 1371, in einsum
return c_einsum(*operands, **kwargs)
ValueError: operand has more dimensions than subscripts given in einstein sum, but no '...' ellipsis provided to broadcast the extra dimensions.
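For what it's worth, the mismatch is reproducible in isolation: 'aa' and 'ba' each give two subscripts to operands that, per the shapes printed above, are rank 3:

import numpy as np

t = np.random.rand(4, 4, 2)
np.einsum('aa,ba->ab', t, t)
# ValueError: operand has more dimensions than subscripts given in
# einstein sum, but no '...' ellipsis provided to broadcast the extra
# dimensions.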
Beginning of conj method