tree-optimization: wrong code due to signed one-bit integer and "a?-1:0" [PR102622]
It turns out this is something of a latent bug, though not entirely latent.
In GCC 9 and 10, phi-opt would transform a?-1:0 (even for a signed 1-bit
integer) to -(type)a, but when the type is a one-bit signed integer the
negation is undefined. GCC 11 fixed the problem by trying the a?pow2cst:0
transformation before the a?-1:0 one: in a signed 1-bit type, -1 has only
its single bit set, so the power-of-two pattern matches first and produces
a well-defined conversion instead of a negation.
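
To make the undefinedness concrete, here is a minimal C sketch (my own
illustration, not the PR's testcase) of the kind of code involved:

/* Hypothetical illustration, not the committed testcase.  A signed
   1-bit bitfield can only represent the values 0 and -1.  */
struct S
{
  signed int f : 1;
};

void
set (struct S *s, int t)
{
  /* Phi-opt sees the COND_EXPR (t != 0) ? -1 : 0 in the 1-bit type of
     s->f.  Folding it to -(type)(t != 0) first converts the boolean to
     the 1-bit type (giving -1 for true) and then negates it; negating
     -1 in a signed 1-bit type overflows, so the negation is undefined.
     The a?pow2cst:0 folding instead produces a plain conversion, which
     is well defined.  */
  s->f = (t != 0) ? -1 : 0;
}
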
When I added the transformations to match.pd, I swapped the order without
paying attention and thought nothing of it, because no existing testcase
failed as a result.
Anyway, this fixes the problem on trunk by swapping the order back in
match.pd and adding a comment explaining why the order matters.
I will try to come up with a patch for the GCC 9 and 10 release branches
later on that fixes the problem there too.
Note that I did not include the original testcase, which requires the
vectorizer and AVX512F, as I could not figure out the right dg options to
restrict it to AVX512F. Instead I came up with a testcase that shows the
problem and which, as mentioned, also demonstrates the problem on the 9/10
release branches; a sketch of such a test follows.
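
/* Sketch of a runtime test for PR102622; the names and exact form are
   illustrative and may differ from the committed testcase.  */
struct f { signed int t : 1; };

__attribute__((noipa)) int
g (struct f *a, int t)
{
  /* With the bad ordering, this conditional can be folded into an
     undefined negation in the 1-bit type, which later passes may
     exploit to miscompile the comparison in main.  */
  a->t = t != 0 ? -1 : 0;
  return a->t;
}

int
main (void)
{
  struct f a;
  if (g (&a, 1) != -1)
    __builtin_abort ();
  return 0;
}
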
OK? Bootstrapped and tested on x86_64-linux-gnu.
	PR tree-optimization/102622

gcc/ChangeLog:

	* match.pd: Swap the order of the a?pow2cst:0 and a?-1:0
	transformations.  Swap the order of the a?0:pow2cst and a?0:-1
	transformations.