Issue #11: reference for mask matrix of 1s [help wanted]

Open

bibuzz opened this issue Aug 2, 2021 · 2 comments
Labels
help wanted Extra attention is needed

Comments

@bibuzz

bibuzz commented Aug 2, 2021

@fsalv @EscVM
With reference to #8 (comment):
What matrix of ones should I set cropped_y_mask to?

    cropped_y_mask = tf.cast([1, 1, 1, 1, 1], tf.float32)

    cropped_predictions_masked = tf.cast(
        cropped_predictions, tf.float32)*cropped_y_mask
    cropped_labels_masked = cropped_labels*cropped_y_mask

    total_pixels_masked = tf.reduce_sum(cropped_y_mask, axis=[1, 2])

I am getting the error:

    InvalidArgumentError: Invalid reduction dimension (1 for input with 1 dimension(s) [Op:Sum]

    D:\SuperResolution\RAMS\utils\loss.py in l1_loss(y_true, y_pred, HR_SIZE)
         50 cropped_labels_masked = cropped_labels*cropped_y_mask
         51
    ---> 52 total_pixels_masked = tf.reduce_sum(cropped_y_mask, axis=[1, 2])
         53
         54 # bias brightness

I tried all of these combos:

    cropped_y_mask = [1, 1, 1, 1, 1]
    cropped_y_mask = [1]
    cropped_y_mask = [1, 1]

FullError.txt
Any help appreciated
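As a side note, the InvalidArgumentError above is consistent with the mask being one-dimensional: tf.reduce_sum(..., axis=[1, 2]) needs an input with at least three dimensions. A minimal sketch with NumPy (whose reduction semantics mirror TensorFlow's here; the shapes are illustrative placeholders):

```python
import numpy as np

# A 1-D mask, as in the attempts above
mask_1d = np.ones(5, dtype=np.float32)

# Reducing over axes 1 and 2 requires at least 3 dimensions,
# so a 1-D array raises the same kind of error TensorFlow reports.
try:
    np.sum(mask_1d, axis=(1, 2))
except Exception as e:
    print("fails as expected:", e)

# A 4-D mask shaped like the network output (B, H, W, C) works:
mask_4d = np.ones((2, 4, 4, 1), dtype=np.float32)
total_pixels = np.sum(mask_4d, axis=(1, 2))  # one count per image per channel
print(total_pixels.shape)
```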

@fsalv
Collaborator

fsalv commented Aug 10, 2021

cropped_y_mask should have the same dimensions as the output, i.e. B x HR_SIZE x HR_SIZE x 1. All the combos you tried are one-dimensional, so they won't work. Try using np.ones.
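A minimal sketch of that suggestion (the batch size and HR_SIZE values below are placeholders, and in the real loss the mask would normally come from the data; np.ones is just the all-ones case):

```python
import numpy as np

B, HR_SIZE = 48, 96  # placeholder batch size and patch size

# All-ones mask with the same shape as the network output: (B, HR_SIZE, HR_SIZE, 1)
cropped_y_mask = np.ones((B, HR_SIZE, HR_SIZE, 1), dtype=np.float32)

# Element-wise masking now broadcasts against (B, HR_SIZE, HR_SIZE, 1) tensors,
# and the per-image pixel count is a sum over the two spatial axes.
total_pixels_masked = cropped_y_mask.sum(axis=(1, 2))  # shape (B, 1)
print(total_pixels_masked[0])  # each image contributes HR_SIZE * HR_SIZE ones
```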

@fsalv fsalv added the help wanted Extra attention is needed label Aug 10, 2021
@bibuzz
Author

bibuzz commented Aug 12, 2021

Hi @fsalv @EscVM

I printed out cropped_predictions and the y_mask to check whether the dimensions are the same, and they both now show (48, 90, 90, 1), like you said: batch_size x HR_SIZE x HR_SIZE x 1.
(Here my HR_SIZE = 96, but some border-pixel shift is done to make it 90, I guess. I also tried the other option, (48, 96, 96, 1), and it resulted in the same error.)

cropped_y_mask = np.ones(shape=(48, 90, 90, 1), dtype=np.float32)

I have attached loss.py and the complete error.

Output
------

    Epoch 1/100
    crop predictions tf.Tensor(
    [[[[4612.5356 ]
       [3837.1716 ]
       [4633.922  ]
       ...
       [2775.706  ]
       [2106.822  ]
       [2689.7336 ]]
      ...
       [2897.199  ]
       [2584.3728 ]
       [2875.7666 ]]]], shape=(48, 90, 90, 1), dtype=float32)

    crop y mask tf.Tensor(
    [[[[1.]
       [1.]
       [1.]
       ...
       [1.]
       [1.]
       [1.]]
      ...
       [1.]]]], shape=(48, 90, 90, 1), dtype=float32)

Error
------

    D:\SuperResolution\RAMS\utils\loss.py in l1_loss(y_true, y_pred, HR_SIZE)
         48 print("crop predictions", tf.cast(cropped_predictions, tf.float32));
         49 print("crop y mask", cropped_y_mask);
    ---> 50 cropped_predictions_masked = tf.cast(
         51     cropped_predictions, tf.float32)*cropped_y_mask
         52 cropped_labels_masked = cropped_labels*cropped_y_mask

    InvalidArgumentError: required broadcastable shapes at loc(unknown) [Op:Mul]
loss1.txt
outputwitherror.txt
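Since both printed tensors show shape (48, 90, 90, 1), multiplying exactly those two would broadcast fine; an Op:Mul "required broadcastable shapes" error usually means one operand at the failing line has a different shape at runtime (for example, labels still at 96 x 96 while the mask was built at 90 x 90). A NumPy sketch of the broadcasting rule (the shapes are illustrative; NumPy and TensorFlow share these broadcasting semantics for element-wise multiply):

```python
import numpy as np

preds = np.ones((48, 90, 90, 1), dtype=np.float32)
mask_90 = np.ones((48, 90, 90, 1), dtype=np.float32)
mask_96 = np.ones((48, 96, 96, 1), dtype=np.float32)

# Identical shapes multiply fine:
print((preds * mask_90).shape)  # (48, 90, 90, 1)

# 96 vs 90 on a spatial axis is not broadcastable (neither is 1),
# the same failure TF reports as "required broadcastable shapes [Op:Mul]".
try:
    preds * mask_96
except ValueError as e:
    print("not broadcastable:", e)
```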
