[PATCH v2] tcg/optimize: optimize TSTNE using smask and zmask
From: Paolo Bonzini <pbonzini@redhat.com>
Subject: [PATCH v2] tcg/optimize: optimize TSTNE using smask and zmask
Date: Fri, 24 Jan 2025 11:27:01 +0100
Generalize the existing optimization of "TSTNE x,sign" and "TSTNE x,-1".
This can be useful for example in the i386 frontend, which will generate
tests of zero-extended registers against 0xffffffff.

Ironically, on x86 hosts this is a very slight pessimization in the very
case it's meant to optimize, because

   brcond_i64 cc_dst,$0xffffffff,tsteq,$L1

(test %ebx, %ebx) is 1 byte smaller than

   brcond_i64 cc_dst,$0x0,eq,$L1

(test %rbx, %rbx).  However, in general it is an improvement, especially
if it avoids placing a large immediate in the constant pool.
Signed-off-by: Paolo Bonzini <pbonzini@redhat.com>
---
tcg/optimize.c | 21 ++++++++++++++++-----
1 file changed, 16 insertions(+), 5 deletions(-)
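
As a quick sanity check outside of QEMU, the z_mask half of the change
can be exercised with a small standalone program.  This is only an
illustrative sketch, not code from the patch: tstne_folds_to_ne0() is a
made-up name, and its z_mask parameter stands in for TempOptInfo.z_mask,
where a 1 bit means the corresponding bit of x may be nonzero.

#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* "(x & i) != 0" is equivalent to "x != 0" when the constant i covers
 * every bit of x that may be nonzero. */
static bool tstne_folds_to_ne0(uint64_t z_mask, uint64_t i)
{
    return (z_mask & ~i) == 0;
}

int main(void)
{
    /* x known to be zero-extended from 32 bits, the i386 frontend case */
    uint64_t z_mask = 0xffffffffull;

    assert(tstne_folds_to_ne0(z_mask, 0xffffffffull)); /* -> NE x,0 */
    assert(tstne_folds_to_ne0(z_mask, -1ull));         /* old TSTNE x,-1 case */
    assert(!tstne_folds_to_ne0(z_mask, 0xffffull));    /* misses bits 16..31 */
    return 0;
}

The second assert is the old "TSTNE x,-1" special case, which the new
condition subsumes.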
diff --git a/tcg/optimize.c b/tcg/optimize.c
index c23f0d13929..a509acf53fe 100644
--- a/tcg/optimize.c
+++ b/tcg/optimize.c
@@ -112,6 +112,13 @@ static inline bool arg_is_const_val(TCGArg arg, uint64_t val)
     return ts_is_const_val(arg_temp(arg), val);
 }
 
+/* Calculate all the copies of the sign bit, both redundant and not. */
+static inline uint64_t all_sign_bit_copies(TCGType type, TempOptInfo *info)
+{
+    int64_t sign_bit = type == TCG_TYPE_I32 ? (int64_t)INT32_MIN : INT64_MIN;
+    return (info->s_mask >> 1) | sign_bit;
+}
+
 static inline bool ts_is_copy(TCGTemp *ts)
 {
     return ts_info(ts)->next_copy != ts;
@@ -765,6 +772,7 @@ static int do_constant_folding_cond1(OptContext *ctx, TCGOp *op, TCGArg dest,
                                      TCGArg *p1, TCGArg *p2, TCGArg *pcond)
 {
     TCGCond cond;
+    TempOptInfo *i1;
     bool swap;
     int r;
 
@@ -782,19 +790,22 @@ static int do_constant_folding_cond1(OptContext *ctx, TCGOp *op, TCGArg dest,
         return -1;
     }
 
+    i1 = arg_info(*p1);
+
     /*
      * TSTNE x,x -> NE x,0
-     * TSTNE x,-1 -> NE x,0
+     * TSTNE x,i -> NE x,0 if i includes all nonzero bits of x
      */
-    if (args_are_copies(*p1, *p2) || arg_is_const_val(*p2, -1)) {
+    if (args_are_copies(*p1, *p2) ||
+        (arg_is_const(*p2) && (i1->z_mask & ~arg_info(*p2)->val) == 0)) {
         *p2 = arg_new_constant(ctx, 0);
         *pcond = tcg_tst_eqne_cond(cond);
         return -1;
     }
 
-    /* TSTNE x,sign -> LT x,0 */
-    if (arg_is_const_val(*p2, (ctx->type == TCG_TYPE_I32
-                               ? INT32_MIN : INT64_MIN))) {
+    /* TSTNE x,i -> LT x,0 if i only includes sign bit copies */
+    if (arg_is_const(*p2) &&
+        (arg_info(*p2)->val & ~all_sign_bit_copies(ctx->type, i1)) == 0) {
         *p2 = arg_new_constant(ctx, 0);
         *pcond = tcg_tst_ltge_cond(cond);
         return -1;
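
The sign-bit half can be checked the same way.  Again a sketch rather
than QEMU code: smask_from_value() assumes s_mask is a left-aligned mask
of the redundant sign-bit repetitions (an assumption, but consistent
with how the new helper consumes it), and all_sign_bit_copies_64() is a
made-up name that hard-codes the TCG_TYPE_I64 case of the new helper.

#include <assert.h>
#include <stdint.h>

/* Left-aligned mask of the redundant sign-bit repetitions of a constant,
 * i.e. the bits whose value is implied by the bits below them. */
static uint64_t smask_from_value(uint64_t value)
{
    int rep = __builtin_clrsbll((long long)value);
    return ~(~0ull >> rep);
}

/* (s_mask >> 1) | sign_bit, as in the patch, for TCG_TYPE_I64. */
static uint64_t all_sign_bit_copies_64(uint64_t s_mask)
{
    return (s_mask >> 1) | (1ull << 63);
}

int main(void)
{
    /* x known to be sign-extended from 32 bits: bits 63..31 all equal */
    uint64_t s_mask = smask_from_value(0xffffffff80000000ull);
    uint64_t copies = all_sign_bit_copies_64(s_mask);

    assert(copies == 0xffffffff80000000ull);
    /* TSTNE x,0x80000000 -> LT x,0: the mask only touches sign copies */
    assert((0x80000000ull & ~copies) == 0);
    /* TSTNE x,0x40000000 must not convert: bit 30 is not a sign copy */
    assert((0x40000000ull & ~copies) != 0);
    return 0;
}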
--
2.48.1