[PATCH v8 32/45] target/arm: Add arm_tlb_bti_gp
From: Richard Henderson
Subject: [PATCH v8 32/45] target/arm: Add arm_tlb_bti_gp
Date: Tue, 23 Jun 2020 12:36:45 -0700
Introduce an lvalue macro to wrap target_tlb_bit0.
Signed-off-by: Richard Henderson <richard.henderson@linaro.org>
---
target/arm/cpu.h | 13 +++++++++++++
target/arm/helper.c | 2 +-
target/arm/translate-a64.c | 2 +-
3 files changed, 15 insertions(+), 2 deletions(-)
diff --git a/target/arm/cpu.h b/target/arm/cpu.h
index a5d3b6c9ee..3121836bdc 100644
--- a/target/arm/cpu.h
+++ b/target/arm/cpu.h
@@ -3393,6 +3393,19 @@ static inline uint64_t *aa64_vfp_qreg(CPUARMState *env, unsigned regno)
/* Shared between translate-sve.c and sve_helper.c. */
extern const uint64_t pred_esz_masks[4];
+/* Helper for the macros below, validating the argument type. */
+static inline MemTxAttrs *typecheck_memtxattrs(MemTxAttrs *x)
+{
+ return x;
+}
+
+/*
+ * Lvalue macros for ARM TLB bits that we must cache in the TCG TLB.
+ * Using these should be a bit more self-documenting than using the
+ * generic target bits directly.
+ */
+#define arm_tlb_bti_gp(x) (typecheck_memtxattrs(x)->target_tlb_bit0)
+
/*
* Naming convention for isar_feature functions:
* Functions which test 32-bit ID registers should have _aa32_ in
diff --git a/target/arm/helper.c b/target/arm/helper.c
index 33f902387b..44a3f9fb48 100644
--- a/target/arm/helper.c
+++ b/target/arm/helper.c
@@ -11079,7 +11079,7 @@ static bool get_phys_addr_lpae(CPUARMState *env, target_ulong address,
}
/* When in aarch64 mode, and BTI is enabled, remember GP in the IOTLB. */
if (aarch64 && guarded && cpu_isar_feature(aa64_bti, cpu)) {
- txattrs->target_tlb_bit0 = true;
+ arm_tlb_bti_gp(txattrs) = true;
}
if (cacheattrs != NULL) {
diff --git a/target/arm/translate-a64.c b/target/arm/translate-a64.c
index 7e8263e86f..ec2295393d 100644
--- a/target/arm/translate-a64.c
+++ b/target/arm/translate-a64.c
@@ -14446,7 +14446,7 @@ static bool is_guarded_page(CPUARMState *env, DisasContext *s)
* table entry even for that case.
*/
return (tlb_hit(entry->addr_code, addr) &&
- env_tlb(env)->d[mmu_idx].iotlb[index].attrs.target_tlb_bit0);
+ arm_tlb_bti_gp(&env_tlb(env)->d[mmu_idx].iotlb[index].attrs));
#endif
}
--
2.25.1