`cropped_y_mask` should have the same dimensions as the output, so B x HR_SIZE x HR_SIZE x 1. All the combos you tried are one-dimensional, so they won't work. Try using `np.ones`.
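A minimal NumPy sketch of what that suggestion looks like (sizes assumed from this thread: batch of 48, `HR_SIZE = 96`; in the actual loss you would build the mask with `tf.ones` and the same reduction axes):

```python
import numpy as np

# Assumed sizes from the thread: batch of 48, HR_SIZE = 96.
B, HR_SIZE = 48, 96

# Mask with the same shape as the network output: B x HR_SIZE x HR_SIZE x 1.
cropped_y_mask = np.ones((B, HR_SIZE, HR_SIZE, 1), dtype=np.float32)

# The loss reduces over the two spatial axes, so the mask must be 4-D.
total_pixels_masked = np.sum(cropped_y_mask, axis=(1, 2))
print(total_pixels_masked.shape)   # (48, 1)
print(total_pixels_masked[0, 0])   # 9216.0, i.e. 96 * 96
```

With a 4-D mask of ones, the per-image pixel count comes out as HR_SIZE * HR_SIZE, which is what the normalization in the loss expects.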
I printed out `crop_prediction` and `y_mask` to check whether their dimensions match, and they do: both show `(48, 90, 90, 1)` now, i.e. batch_size x hr_size x hr_size x 1 like you said (here my hr_size = 96, but some border pixel shift reduces it to 90, I guess). I also tried the other option, `(48, 96, 96, 1)`, and it resulted in the same error.
@fsalv @EscVM
With reference to #8 (comment)
What shape should the matrix of ones be that I set `cropped_y_mask` to? With

```python
cropped_y_mask = tf.cast([1, 1, 1, 1, 1], tf.float32)
```

I am getting this error:

```
InvalidArgumentError: Invalid reduction dimension (1 for input with 1 dimension(s) [Op:Sum]
```
```
D:\SuperResolution\RAMS\utils\loss.py in l1_loss(y_true, y_pred, HR_SIZE)
     50     cropped_labels_masked = cropped_labels*cropped_y_mask
     51
---> 52     total_pixels_masked = tf.reduce_sum(cropped_y_mask, axis=[1, 2])
     53
     54     # bias brightness
```
I tried all of these combinations:

```python
cropped_y_mask = [1, 1, 1, 1, 1]
cropped_y_mask = [1]
cropped_y_mask = [1, 1]
```
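For reference, a small NumPy sketch of why every one-dimensional attempt fails: the reduction in `loss.py` sums over axes 1 and 2, but a 1-D mask only has axis 0, so the `reduce_sum` itself raises (NumPy's `AxisError` is the analogue of TF's `InvalidArgumentError` here).

```python
import numpy as np

# A 1-D mask only has axis 0, so reducing over axes 1 and 2 must fail,
# regardless of how many ones the list contains.
mask_1d = np.array([1, 1, 1, 1, 1], dtype=np.float32)
try:
    np.sum(mask_1d, axis=(1, 2))
    reduction_failed = False
except Exception as err:  # NumPy raises AxisError for an out-of-range axis
    reduction_failed = True
    print("reduction over axes (1, 2) failed:", err)
```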
FullError.txt
Any help is appreciated.