git.ipfire.org Git - thirdparty/binutils-gdb.git/commitdiff
RISC-V/rvv: Add rvv v0.10 instructions.
author: Nelson Chu <nelson.chu@sifive.com>
Fri, 19 Mar 2021 09:19:11 +0000 (17:19 +0800)
committer: Nelson Chu <nelson.chu@sifive.com>
Thu, 28 Oct 2021 00:50:29 +0000 (08:50 +0800)
2021-03-30  Jim Wilson  <jimw@sifive.com>
            Kito Cheng  <kito.cheng@sifive.com>
            Nelson Chu  <nelson.chu@sifive.com>

This patch is ported from the following RISC-V GitHub branch,
https://github.com/riscv/riscv-binutils-gdb/tree/rvv-1.0.x

The vector draft spec is here,
https://github.com/riscv/riscv-v-spec

The match_func handlers in opcodes/riscv-opc.c serve several purposes.  One of
them is checking instruction constraints.  But we have received reports that
the assembler's constraint checking breaks some hardware exception testcases,
which are written in assembly code.  Therefore, we add new assembler options
and .option directives to let users disable/enable the rvv constraints.
For now the constraints are disabled by default, but should we enable them
by default for safety?  Besides, match_func now returns specific constraint
error messages, so that we can report the details to users.
This should be more user-friendly.
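For illustration, the new controls (the option and directive names are taken from the patch below) can be used like this; whether a particular instruction is rejected depends on the rvv overlap rules, so the diagnostic comment is a sketch rather than guaranteed assembler output:

```asm
	# Constraint checking is off by default in this patch;
	# -mcheck-constraints / -mno-check-constraints set the initial state.
	.option push
	.option checkconstraints	# turn match_func constraint checking on
	vwadd.vv v2, v2, v4		# widening vd overlapping a source group
					# should now be diagnosed
	.option nocheckconstraints	# and off again
	.option pop
```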

bfd/
* elfxx-riscv.c (riscv_supported_std_ext): Updated the draft
version of v.
(riscv_supported_std_z_ext): Added draft zvamo and zvlsseg.
gas/
* config/tc-riscv.c (enum DRAFT_EXT): Added.
(enum riscv_extended_csr_class): Added CSR_CLASS_V for rvv CSRs.
(enum reg_extended_class): Added vector registers.
(op_draft_hash): Added draft hash table for rvv.
(md_begin): Init op_draft_hash and register hash for rvv.
(riscv_extended_subset_supports): Handle INSN_CLASS_V*.
(riscv_extended_csr_class_check): Handle CSR_CLASS_V.
(validate_riscv_extended_insn): Check if the rvv instructions are valid.
(riscv_find_extended_opcode_hash): Search instruction opcode from
op_draft_hash.
(vector_macro): Call macro_build to expand rvv macros into instructions.
(extended_macro_build): Handle rvv operands for macro_build.
(extended_macro): Handle M_VMSGE and M_VMSGEU.
(my_getVsetvliExpression): Similar to my_getExpression, but used
for parsing vsetvli operands.
(riscv_parse_extended_operands): Handle rvv operands.  Pass &regno from
riscv_ip; otherwise parsing fails for the Vf operand of AMO VS3.
(riscv_ip): Add two new arguments to match_func, check_constraints and
&error.  We can disable the match_func check by setting check_constraints
to zero; match_func sets a specific error message in &error, and then we
can report more details to users.
(riscv_set_options, riscv_opts, s_riscv_option):  Add .option
checkconstraints and nocheckconstraints, to enable/disable the
match_func constraints checking.  Disable it by default.
(enum options, md_longopts, md_parse_option): Add assembler options
m[no-]check-constraints.
* testsuite/gas/riscv/extended/extended.exp: Updated.
* testsuite/gas/riscv/extended/extended-csr.d: New testcase for rvv CSRs.
* testsuite/gas/riscv/extended/extended-csr.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.d:
New testcase for rvv constraints.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-int.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-int.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-int.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-load-store.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-load-store.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-load-store.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-mask.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-mask.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-mask.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-permutation.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-permutation.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-permutation.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zvamo.l: Likewise.
* testsuite/gas/riscv/extended/vector-insns-fail-zvamo.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-vmsgtvx.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns-zero-imm.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns-zero-imm.s: Likewise.
* testsuite/gas/riscv/extended/vector-insns.d: Likewise.
* testsuite/gas/riscv/extended/vector-insns.s: Likewise.
include/
* opcode/riscv-opc-extended.h: Added rvv encoding macros and CSRs.
* opcode/riscv.h: Added rvv immediate encodings and fields.
(struct riscv_opcode): Updated match_func.
(enum riscv_extended_insn_class): Added INSN_CLASS_V*.
(enum M_VMSGE, M_VMSGEU): Added.
opcodes/
* riscv-dis.c (print_extended_insn_args): Handle rvv operands.
(riscv_disassemble_opcode): Updated match_func.
* riscv-opc.c (match_*): Updated for the two new parameters.
(riscv_vecr_names_numeric): Added rvv register names.
(riscv_vecm_names_numeric): Added rvv mask register name.
(riscv_vsew, riscv_vlmul, riscv_vta, riscv_vma): Added for vsetvli.
(MASK_VD, MASK_VS1, MASK_VS2, MASK_VMASK): Added for rvv match_func.
(match_vs1_eq_vs2, match_vs1_eq_vs2_neq_vm, match_vd_eq_vs1_eq_vs2):
Added to check special register usage; these checks cannot be disabled.
(match_widen_vd_neq_vs1_neq_vs2_neq_vm): An rvv constraint check,
which can be enabled/disabled by m[no-]check-constraints or .option
[no]checkconstraints.
(match_widen_vd_neq_vs1_neq_vm): Likewise.
(match_widen_vd_neq_vs2_neq_vm): Likewise.
(match_widen_vd_neq_vm): Likewise.
(match_narrow_vd_neq_vs2_neq_vm): Likewise.
(match_vd_neq_vs1_neq_vs2): Likewise.
(match_vd_neq_vs1_neq_vs2_neq_vm): Likewise.
(match_vd_neq_vs2_neq_vm): Likewise.
(match_vd_neq_vm): Likewise.
(match_vls_nf_rv): Likewise.
(match_vmv_nf_rv): Likewise.
(riscv_draft_opcodes): Added rvv v0.10 instructions.
(riscv_extended_opcodes): Updated.

42 files changed:
bfd/elfxx-riscv.c
gas/config/tc-riscv.c
gas/testsuite/gas/riscv/extended/extended-csr.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/extended-csr.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/extended.exp
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.l [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-zero-imm.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns-zero-imm.s [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns.d [new file with mode: 0644]
gas/testsuite/gas/riscv/extended/vector-insns.s [new file with mode: 0644]
include/opcode/riscv-opc-extended.h
include/opcode/riscv.h
opcodes/riscv-dis.c
opcodes/riscv-opc.c

index cdb4fa0996ac92b28d8ced7d2847297109ebb5a3..34fd0e3b7d27e1e8c5e146441259cd560dae6d33 100644 (file)
@@ -1130,7 +1130,7 @@ static struct riscv_supported_ext riscv_supported_std_ext[] =
   {"j",                ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
   {"t",                ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
   {"p",                ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
-  {"v",                ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
+  {"v",                ISA_SPEC_CLASS_DRAFT,           0, 10, 0 },             /* draft.  */
   {"n",                ISA_SPEC_CLASS_NONE, RISCV_UNKNOWN_VERSION, RISCV_UNKNOWN_VERSION, 0 },
   {NULL, 0, 0, 0, 0}
 };
@@ -1146,6 +1146,8 @@ static struct riscv_supported_ext riscv_supported_std_z_ext[] =
   {"zba",              ISA_SPEC_CLASS_DRAFT,           1, 0,  0 },
   {"zbc",              ISA_SPEC_CLASS_DRAFT,           1, 0,  0 },
   {"zbs",               ISA_SPEC_CLASS_DRAFT,           1, 0,  0 },
+  {"zvamo",            ISA_SPEC_CLASS_DRAFT,           0, 10, 0 },     /* draft.  */
+  {"zvlsseg",          ISA_SPEC_CLASS_DRAFT,           0, 10, 0 },     /* draft.  */
   {NULL, 0, 0, 0, 0}
 };
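With the entries above, the draft vector extensions can be named in -march strings. A sketch of the usage (the exact version-suffix spelling, e.g. 0p10, follows the toolchain's usual convention and is an assumption here; the testcases below use plain rv32iv):

```
as -march=rv32iv vector.s            # bare 'v' now resolves to draft 0.10
as -march=rv32i_zvlsseg0p10 seg.s    # only the segment load/store subset
```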
 
index cafdda4d796bd99bf7f73c07d6f5129d5063b1ea..6ddb710282090d8f051d40e052d3a35ae8f344af 100644 (file)
@@ -38,7 +38,8 @@
 /* Used to choose the right opcode hashes for extended extensions.  */
 enum
 {
-  EXTENDED_EXT_NUM = 0
+  DRAFT_EXT = 0,
+  EXTENDED_EXT_NUM
 };
 
 /* Information about an instruction, including its format, operands
@@ -73,6 +74,12 @@ enum riscv_csr_class
   CSR_CLASS_EXTENDED /* Extended CSR  */
 };
 
+/* All RISC-V extended CSR belong to one of these classes.  */
+enum riscv_extended_csr_class
+{
+  CSR_CLASS_V = CSR_CLASS_EXTENDED, /* RVV CSR */
+};
+
 /* This structure holds all restricted conditions for a CSR.  */
 struct riscv_csr_extra
 {
@@ -214,6 +221,7 @@ struct riscv_set_options
   int relax; /* Emit relocs the linker is allowed to relax.  */
   int arch_attr; /* Emit architecture and privileged elf attributes.  */
   int csr_check; /* Enable the CSR checking.  */
+  int check_constraints; /* Enable/disable the match_func checking.  */
 };
 
 static struct riscv_set_options riscv_opts =
@@ -224,6 +232,7 @@ static struct riscv_set_options riscv_opts =
   1, /* relax */
   DEFAULT_RISCV_ATTR, /* arch_attr */
   0, /* csr_check */
+  0, /* check_constraints */
 };
 
 static void
@@ -262,6 +271,16 @@ riscv_extended_subset_supports (int insn_class)
 {
   switch (insn_class)
     {
+    case INSN_CLASS_V: return riscv_subset_supports ("v");
+    case INSN_CLASS_V_AND_F:
+      return riscv_subset_supports ("v") && riscv_subset_supports ("f");
+    case INSN_CLASS_V_OR_ZVAMO:
+      return (riscv_subset_supports ("a")
+             && (riscv_subset_supports ("v")
+                 || riscv_subset_supports ("zvamo")));
+    case INSN_CLASS_V_OR_ZVLSSEG:
+      return (riscv_subset_supports ("v")
+             || riscv_subset_supports ("zvlsseg"));
     default:
       as_fatal ("internal: unknown INSN_CLASS (0x%x)", insn_class);
       return false;
@@ -407,6 +426,9 @@ riscv_set_abi_by_arch (void)
 /* Handle of the OPCODE hash table.  */
 static htab_t op_hash = NULL;
 
+/* Handle of the draft OPCODE hash table.  */
+static htab_t op_draft_hash = NULL;
+
 /* Handle of the type of .insn hash table.  */
 static htab_t insn_type_hash = NULL;
 
@@ -848,6 +870,8 @@ opcode_name_lookup (char **s)
 /* Extended registers.  */
 enum reg_extended_class
 {
+  RCLASS_VECR,
+  RCLASS_VECM,
   RCLASS_EXTENDED_NUM,
 };
 
@@ -934,6 +958,10 @@ riscv_extended_csr_class_check (int csr_class)
 {
   switch (csr_class)
     {
+    case CSR_CLASS_V:
+      return (riscv_subset_supports ("v")
+             || riscv_subset_supports ("zvamo")
+             || riscv_subset_supports ("zvlsseg"));
     default:
       as_bad (_("internal: bad RISC-V CSR class (0x%x)"), csr_class);
     }
@@ -1102,6 +1130,30 @@ validate_riscv_extended_insn (insn_t *bits,
 
   switch (*oparg)
     {
+    case 'V': /* RVV */
+      switch (*++oparg)
+       {
+       case 'd':
+       case 'f': USE_BITS (OP_MASK_VD, OP_SH_VD); break;
+       case 'e': USE_BITS (OP_MASK_VWD, OP_SH_VWD); break;
+       case 's': USE_BITS (OP_MASK_VS1, OP_SH_VS1); break;
+       case 't': USE_BITS (OP_MASK_VS2, OP_SH_VS2); break;
+       case 'u': USE_BITS (OP_MASK_VS1, OP_SH_VS1);
+                 USE_BITS (OP_MASK_VS2, OP_SH_VS2); break;
+       case 'v': USE_BITS (OP_MASK_VD, OP_SH_VD);
+                 USE_BITS (OP_MASK_VS1, OP_SH_VS1);
+                 USE_BITS (OP_MASK_VS2, OP_SH_VS2); break;
+       case '0': break;
+       case 'b': used_bits |= ENCODE_RVV_VB_IMM (-1U); break;
+       case 'c': used_bits |= ENCODE_RVV_VC_IMM (-1U); break;
+       case 'i':
+       case 'j':
+       case 'k': USE_BITS (OP_MASK_VIMM, OP_SH_VIMM); break;
+       case 'm': USE_BITS (OP_MASK_VMASK, OP_SH_VMASK); break;
+       default:
+         return false;
+       }
+      break;
     default:
       return false;
     }
@@ -1350,6 +1402,11 @@ md_begin (void)
 
   /* Set the default alignment for the text section.  */
   record_alignment (text_section, riscv_opts.rvc ? 1 : 2);
+
+  /* Extended settings.  */
+  hash_reg_names (RCLASS_VECR, riscv_vecr_names_numeric, NVECR);
+  hash_reg_names (RCLASS_VECM, riscv_vecm_names_numeric, NVECM);
+  op_draft_hash = init_opcode_hash (riscv_extended_opcodes[DRAFT_EXT], false);
 }
 
 static insn_t
@@ -1455,6 +1512,9 @@ riscv_find_extended_opcode_hash (char *str ATTRIBUTE_UNUSED)
 
       switch (i)
        {
+       case DRAFT_EXT:
+         insn = (struct riscv_opcode *) str_hash_find (op_draft_hash, str);
+         break;
        default:
          break;
        }
@@ -1483,13 +1543,45 @@ riscv_find_opcode_hash (char *str, htab_t hash)
 static bool
 extended_macro_build (struct riscv_cl_insn* insn_p,
                      const char **fmt_p,
-                     va_list args ATTRIBUTE_UNUSED)
+                     va_list args)
 {
   struct riscv_cl_insn insn = *insn_p;
   const char *fmt = *fmt_p;
 
   switch (*fmt)
     {
+    case 'V': /* RVV */
+      switch (*++fmt)
+       {
+       case 'd':
+         INSERT_OPERAND (VD, insn, va_arg (args, int));
+         break;
+
+       case 's':
+         INSERT_OPERAND (VS1, insn, va_arg (args, int));
+         break;
+
+       case 't':
+         INSERT_OPERAND (VS2, insn, va_arg (args, int));
+         break;
+
+       case 'm':
+         {
+           int reg = va_arg (args, int);
+           if (reg == -1)
+             INSERT_OPERAND (VMASK, insn, 1);
+           else if (reg == 0)
+             INSERT_OPERAND (VMASK, insn, 0);
+           else
+             return false;
+         }
+         break;
+
+       default:
+         return false;
+       }
+      break;
+
     default:
       return false;
     }
@@ -1746,16 +1838,110 @@ riscv_ext (int destreg, int srcreg, unsigned shift, bool sign)
     }
 }
 
+/* Expand RISC-V Vector macros into one or more instructions.  */
+
+static void
+vector_macro (struct riscv_cl_insn *ip)
+{
+  int vd = (ip->insn_opcode >> OP_SH_VD) & OP_MASK_VD;
+  int vs1 = (ip->insn_opcode >> OP_SH_VS1) & OP_MASK_VS1;
+  int vs2 = (ip->insn_opcode >> OP_SH_VS2) & OP_MASK_VS2;
+  int vm = (ip->insn_opcode >> OP_SH_VMASK) & OP_MASK_VMASK;
+  int vtemp = (ip->insn_opcode >> OP_SH_VFUNCT6) & OP_MASK_VFUNCT6;
+  int mask = ip->insn_mo->mask;
+
+  switch (mask)
+    {
+    case M_VMSGE:
+      if (vm)
+       {
+         /* Unmasked.  */
+         macro_build (NULL, "vmslt.vx", "Vd,Vt,sVm", vd, vs2, vs1, -1);
+         macro_build (NULL, "vmnand.mm", "Vd,Vt,Vs", vd, vd, vd);
+         break;
+       }
+      if (vtemp != 0)
+       {
+         /* Masked.  Have vtemp to avoid overlap constraints.  */
+         if (vd == vm)
+           {
+             macro_build (NULL, "vmslt.vx", "Vd,Vt,s", vtemp, vs2, vs1);
+             macro_build (NULL, "vmandnot.mm", "Vd,Vt,Vs", vd, vm, vtemp);
+           }
+         else
+           {
+             /* Preserve the value of vd if not updating by vm.  */
+             macro_build (NULL, "vmslt.vx", "Vd,Vt,s", vtemp, vs2, vs1);
+             macro_build (NULL, "vmandnot.mm", "Vd,Vt,Vs", vtemp, vm, vtemp);
+             macro_build (NULL, "vmandnot.mm", "Vd,Vt,Vs", vd, vd, vm);
+             macro_build (NULL, "vmor.mm", "Vd,Vt,Vs", vd, vtemp, vd);
+           }
+       }
+      else if (vd != vm)
+       {
+         /* Masked.  This may cause the vd overlaps vs2, when LMUL > 1.  */
+         macro_build (NULL, "vmslt.vx", "Vd,Vt,sVm", vd, vs2, vs1, vm);
+         macro_build (NULL, "vmxor.mm", "Vd,Vt,Vs", vd, vd, vm);
+       }
+      else
+       as_bad (_("must provide temp if destination overlaps mask"));
+      break;
+
+    case M_VMSGEU:
+      if (vm)
+       {
+         /* Unmasked.  */
+         macro_build (NULL, "vmsltu.vx", "Vd,Vt,sVm", vd, vs2, vs1, -1);
+         macro_build (NULL, "vmnand.mm", "Vd,Vt,Vs", vd, vd, vd);
+         break;
+       }
+      if (vtemp != 0)
+       {
+         /* Masked.  Have vtemp to avoid overlap constraints.  */
+         if (vd == vm)
+           {
+             macro_build (NULL, "vmsltu.vx", "Vd,Vt,s", vtemp, vs2, vs1);
+             macro_build (NULL, "vmandnot.mm", "Vd,Vt,Vs", vd, vm, vtemp);
+           }
+         else
+           {
+             /* Preserve the value of vd if not updating by vm.  */
+             macro_build (NULL, "vmsltu.vx", "Vd,Vt,s", vtemp, vs2, vs1);
+             macro_build (NULL, "vmandnot.mm", "Vd,Vt,Vs", vtemp, vm, vtemp);
+             macro_build (NULL, "vmandnot.mm", "Vd,Vt,Vs", vd, vd, vm);
+             macro_build (NULL, "vmor.mm", "Vd,Vt,Vs", vd, vtemp, vd);
+           }
+       }
+      else if (vd != vm)
+       {
+         /* Masked.  This may cause the vd overlaps vs2, when LMUL > 1.  */
+         macro_build (NULL, "vmsltu.vx", "Vd,Vt,sVm", vd, vs2, vs1, vm);
+         macro_build (NULL, "vmxor.mm", "Vd,Vt,Vs", vd, vd, vm);
+       }
+      else
+       as_bad (_("must provide temp if destination overlaps mask"));
+      break;
+
+    default:
+      break;
+    }
+}
+
 /* Expand RISC-V extended assembly macros into one or more instructions.  */
 
 static bool
-extended_macro (struct riscv_cl_insn *ip ATTRIBUTE_UNUSED,
+extended_macro (struct riscv_cl_insn *ip,
                int mask,
                expressionS *imm_expr ATTRIBUTE_UNUSED,
                bfd_reloc_code_real_type *imm_reloc ATTRIBUTE_UNUSED)
 {
   switch (mask)
     {
+    case M_VMSGE:
+    case M_VMSGEU:
+      vector_macro (ip);
+      break;
+
     default:
       return false;
     }
@@ -2165,6 +2351,66 @@ riscv_is_priv_insn (insn_t insn)
          || ((insn ^ MATCH_SFENCE_VM) & MASK_SFENCE_VM) == 0);
 }
 
+/* Parse string STR as a vsetvli operand.  Store the expression in *EP.
+   On exit, EXPR_END points to the first character after the expression.  */
+
+static void
+my_getVsetvliExpression (expressionS *ep, char *str)
+{
+  unsigned int vsew_value = 0, vlmul_value = 0;
+  unsigned int vta_value = 0, vma_value = 0;
+  bool vsew_found = false, vlmul_found = false;
+  bool vta_found = false, vma_found = false;
+
+  if (arg_lookup (&str, riscv_vsew, ARRAY_SIZE (riscv_vsew), &vsew_value))
+    {
+      if (*str == ',')
+       ++str;
+      if (vsew_found)
+       as_bad (_("multiple vsew constants"));
+      vsew_found = true;
+    }
+  if (arg_lookup (&str, riscv_vlmul, ARRAY_SIZE (riscv_vlmul), &vlmul_value))
+    {
+      if (*str == ',')
+       ++str;
+      if (vlmul_found)
+       as_bad (_("multiple vlmul constants"));
+      vlmul_found = true;
+    }
+  if (arg_lookup (&str, riscv_vta, ARRAY_SIZE (riscv_vta), &vta_value))
+    {
+      if (*str == ',')
+       ++str;
+      if (vta_found)
+       as_bad (_("multiple vta constants"));
+      vta_found = true;
+    }
+  if (arg_lookup (&str, riscv_vma, ARRAY_SIZE (riscv_vma), &vma_value))
+    {
+      if (*str == ',')
+       ++str;
+      if (vma_found)
+       as_bad (_("multiple vma constants"));
+      vma_found = true;
+    }
+
+  if (vsew_found || vlmul_found || vta_found || vma_found)
+    {
+      ep->X_op = O_constant;
+      ep->X_add_number = (vlmul_value << OP_SH_VLMUL)
+                         | (vsew_value << OP_SH_VSEW)
+                         | (vta_value << OP_SH_VTA)
+                         | (vma_value << OP_SH_VMA);
+      expr_end = str;
+    }
+  else
+    {
+      my_getExpression (ep, str);
+      str = expr_end;
+    }
+}
+
 /* Parse all extended operands for riscv_ip.  */
 
 static bool
@@ -2172,13 +2418,192 @@ riscv_parse_extended_operands (struct riscv_cl_insn *ip ATTRIBUTE_UNUSED,
                               expressionS *imm_expr ATTRIBUTE_UNUSED,
                               bfd_reloc_code_real_type *imm_reloc ATTRIBUTE_UNUSED,
                               const char **opcode_args,
-                              char **assembly_args)
+                              char **assembly_args,
+                              unsigned int *regno_p)
 {
   const char *oparg = *opcode_args;
   char *asarg = *assembly_args;
+  unsigned int regno = *regno_p;
 
   switch (*oparg)
     {
+    case 'V': /* RVV */
+      switch (*++oparg)
+       {
+       case 'd': /* VD */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno))
+           return false;
+         INSERT_OPERAND (VD, *ip, regno);
+         break;
+
+       case 'e': /* AMO VD */
+         if (reg_lookup (&asarg, RCLASS_GPR, &regno) && regno == 0)
+           INSERT_OPERAND (VWD, *ip, 0);
+         else if (reg_lookup (&asarg, RCLASS_VECR, &regno))
+           {
+             INSERT_OPERAND (VWD, *ip, 1);
+             INSERT_OPERAND (VD, *ip, regno);
+           }
+         else
+           return false;
+         break;
+
+       case 'f': /* AMO VS3 */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno))
+           return false;
+         if (!EXTRACT_OPERAND (VWD, ip->insn_opcode))
+           INSERT_OPERAND (VD, *ip, regno);
+         else
+           {
+             /* VS3 must match VD.  */
+             if (EXTRACT_OPERAND (VD, ip->insn_opcode) != regno)
+               return false;
+           }
+         break;
+
+       case 's': /* VS1 */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno))
+           return false;
+         INSERT_OPERAND (VS1, *ip, regno);
+         break;
+
+       case 't': /* VS2 */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno))
+           return false;
+         INSERT_OPERAND (VS2, *ip, regno);
+         break;
+
+       case 'u': /* VS1 == VS2 */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno))
+           return false;
+         INSERT_OPERAND (VS1, *ip, regno);
+         INSERT_OPERAND (VS2, *ip, regno);
+         break;
+
+       case 'v': /* VD == VS1 == VS2 */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno))
+           return false;
+         INSERT_OPERAND (VD, *ip, regno);
+         INSERT_OPERAND (VS1, *ip, regno);
+         INSERT_OPERAND (VS2, *ip, regno);
+         break;
+
+       /* The `V0` is carry-in register for v[m]adc and v[m]sbc,
+          and is used to choose vs1/rs1/frs1/imm or vs2 for
+          v[f]merge.  It uses the same encoding as the vector mask
+          register.  */
+       case '0':
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno) || regno != 0)
+           return false;
+         break;
+
+       case 'b': /* vtypei for vsetivli */
+         my_getVsetvliExpression (imm_expr, asarg);
+         check_absolute_expr (ip, imm_expr, false);
+         if (!VALID_RVV_VB_IMM (imm_expr->X_add_number))
+           {
+             as_bad (_("bad value for vsetivli immediate field, "
+                       "value must be 0..1023"));
+             return false;
+           }
+         ip->insn_opcode
+           |= ENCODE_RVV_VB_IMM (imm_expr->X_add_number);
+         imm_expr->X_op = O_absent;
+         asarg = expr_end;
+         break;
+
+       case 'c': /* vtypei for vsetvli */
+         my_getVsetvliExpression (imm_expr, asarg);
+         check_absolute_expr (ip, imm_expr, false);
+         if (!VALID_RVV_VC_IMM (imm_expr->X_add_number))
+           {
+             as_bad (_("bad value for vsetvli immediate field, "
+                       "value must be 0..2047"));
+             return false;
+           }
+         ip->insn_opcode
+           |= ENCODE_RVV_VC_IMM (imm_expr->X_add_number);
+         imm_expr->X_op = O_absent;
+         asarg = expr_end;
+         break;
+
+       case 'i': /* vector arith signed immediate */
+         my_getExpression (imm_expr, asarg);
+         check_absolute_expr (ip, imm_expr, false);
+         if (imm_expr->X_add_number > 15
+             || imm_expr->X_add_number < -16)
+           {
+             as_bad (_("bad value for vector immediate field, "
+                       "value must be -16...15"));
+             return false;
+           }
+         INSERT_OPERAND (VIMM, *ip, imm_expr->X_add_number);
+         imm_expr->X_op = O_absent;
+         asarg = expr_end;
+         break;
+
+       case 'j': /* vector arith unsigned immediate */
+         my_getExpression (imm_expr, asarg);
+         check_absolute_expr (ip, imm_expr, false);
+         if (imm_expr->X_add_number < 0
+             || imm_expr->X_add_number >= 32)
+           {
+             as_bad (_("bad value for vector immediate field, "
+                       "value must be 0...31"));
+             return false;
+           }
+         INSERT_OPERAND (VIMM, *ip, imm_expr->X_add_number);
+         imm_expr->X_op = O_absent;
+         asarg = expr_end;
+         break;
+
+       case 'k': /* vector arith signed immediate, minus 1 */
+         my_getExpression (imm_expr, asarg);
+         check_absolute_expr (ip, imm_expr, false);
+         if (imm_expr->X_add_number > 16
+             || imm_expr->X_add_number < -15)
+           {
+             as_bad (_("bad value for vector immediate field, "
+                       "value must be -15...16"));
+             return false;
+           }
+         INSERT_OPERAND (VIMM, *ip, imm_expr->X_add_number - 1);
+         imm_expr->X_op = O_absent;
+         asarg = expr_end;
+         break;
+
+       case 'm': /* optional vector mask */
+         if (*asarg == '\0')
+           INSERT_OPERAND (VMASK, *ip, 1);
+         else if (*asarg == ',' && asarg++
+                  && reg_lookup (&asarg, RCLASS_VECM, &regno)
+                  && regno == 0)
+           INSERT_OPERAND (VMASK, *ip, 0);
+         else
+           return false;
+         break;
+
+       /* The following ones are only used in macros.  */
+       case 'M': /* required vector mask */
+         if (reg_lookup (&asarg, RCLASS_VECM, &regno) && regno == 0)
+           INSERT_OPERAND (VMASK, *ip, 0);
+         else
+           return false;
+         break;
+
+       case 'T': /* vector macro temporary register */
+         if (!reg_lookup (&asarg, RCLASS_VECR, &regno) || regno == 0)
+           return false;
+         /* Store it in the FUNCT6 field as we don't have anyplace
+            else to store it.  */
+         INSERT_OPERAND (VFUNCT6, *ip, regno);
+         break;
+
+       default:
+         return false;
+       }
+      break;
+
     default:
       as_fatal (_("internal: unknown argument type `%s'"),
                *opcode_args);
@@ -2186,6 +2611,7 @@ riscv_parse_extended_operands (struct riscv_cl_insn *ip ATTRIBUTE_UNUSED,
     }
 
   *opcode_args = oparg;
+  *regno_p = regno;
   *assembly_args = asarg;
   return true;
 }
@@ -2230,6 +2656,8 @@ riscv_ip (char *str, struct riscv_cl_insn *ip, expressionS *imm_expr,
       if (!riscv_multi_subset_supports (insn->insn_class))
        continue;
 
+      /* Reset error message of the previous round.  */
+      error = _("illegal operands");
       create_insn (ip, insn);
       argnum = 1;
 
@@ -2246,7 +2674,9 @@ riscv_ip (char *str, struct riscv_cl_insn *ip, expressionS *imm_expr,
            case '\0': /* End of args.  */
              if (insn->pinfo != INSN_MACRO)
                {
-                 if (!insn->match_func (insn, ip->insn_opcode))
+                 if (!insn->match_func (insn, ip->insn_opcode,
+                                        riscv_opts.check_constraints,
+                                        &error))
                    break;
 
                  /* For .insn, insn->match and insn->mask are 0.  */
@@ -2943,13 +3373,12 @@ riscv_ip (char *str, struct riscv_cl_insn *ip, expressionS *imm_expr,
              parse_extended_operand:
                oparg = opargStart;
                if (riscv_parse_extended_operands (ip, imm_expr, imm_reloc,
-                                                  &oparg, &asarg))
+                                                  &oparg, &asarg, &regno))
                  continue;
            }
          break;
        }
       asarg = asargStart;
-      error = _("illegal operands");
       insn_with_csr = false;
     }
 
@@ -3073,6 +3502,8 @@ enum options
   OPTION_MPRIV_SPEC,
   OPTION_BIG_ENDIAN,
   OPTION_LITTLE_ENDIAN,
+  OPTION_CHECK_CONSTRAINTS,
+  OPTION_NO_CHECK_CONSTRAINTS,
   OPTION_END_OF_ENUM
 };
 
@@ -3093,6 +3524,8 @@ struct option md_longopts[] =
   {"mpriv-spec", required_argument, NULL, OPTION_MPRIV_SPEC},
   {"mbig-endian", no_argument, NULL, OPTION_BIG_ENDIAN},
   {"mlittle-endian", no_argument, NULL, OPTION_LITTLE_ENDIAN},
+  {"mcheck-constraints", no_argument, NULL, OPTION_CHECK_CONSTRAINTS},
+  {"mno-check-constraints", no_argument, NULL, OPTION_NO_CHECK_CONSTRAINTS},
 
   {NULL, no_argument, NULL, 0}
 };
@@ -3177,6 +3610,14 @@ md_parse_option (int c, const char *arg)
       target_big_endian = 0;
       break;
 
+    case OPTION_CHECK_CONSTRAINTS:
+      riscv_opts.check_constraints = true;
+      break;
+
+    case OPTION_NO_CHECK_CONSTRAINTS:
+      riscv_opts.check_constraints = false;
+      break;
+
     default:
       return 0;
     }
@@ -3549,6 +3990,10 @@ s_riscv_option (int x ATTRIBUTE_UNUSED)
     riscv_opts.csr_check = true;
   else if (strcmp (name, "no-csr-check") == 0)
     riscv_opts.csr_check = false;
+  else if (strcmp (name, "checkconstraints") == 0)
+    riscv_opts.check_constraints = true;
+  else if (strcmp (name, "nocheckconstraints") == 0)
+    riscv_opts.check_constraints = false;
   else if (strcmp (name, "push") == 0)
     {
       struct riscv_option_stack *s;
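The command-line switches and the `.option` directives added above can be mixed within one file; a hypothetical usage sketch (the instruction choices are illustrative, not from this commit's tests):

```asm
	# assembled with: as -march=rv32iv -mcheck-constraints
	vadd.vv v4, v4, v8, v0.t	# accepted: vd does not overlap vm
	.option nocheckconstraints
	vadd.vv v0, v4, v8, v0.t	# accepted: checking disabled
	.option checkconstraints
	vadd.vv v0, v4, v8, v0.t	# rejected: vd overlaps vm
```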
diff --git a/gas/testsuite/gas/riscv/extended/extended-csr.d b/gas/testsuite/gas/riscv/extended/extended-csr.d
new file mode 100644 (file)
index 0000000..bfe102b
--- /dev/null
@@ -0,0 +1,16 @@
+#as: -march=rv32iv
+#objdump: -d
+
+.*:[   ]+file format .*
+
+
+Disassembly of section .text:
+
+0+000 <.text>:
+[      ]+[0-9a-f]+:[   ]+00802573[     ]+csrr[         ]+a0,vstart
+[      ]+[0-9a-f]+:[   ]+00902573[     ]+csrr[         ]+a0,vxsat
+[      ]+[0-9a-f]+:[   ]+00a02573[     ]+csrr[         ]+a0,vxrm
+[      ]+[0-9a-f]+:[   ]+00f02573[     ]+csrr[         ]+a0,vcsr
+[      ]+[0-9a-f]+:[   ]+c2002573[     ]+csrr[         ]+a0,vl
+[      ]+[0-9a-f]+:[   ]+c2102573[     ]+csrr[         ]+a0,vtype
+[      ]+[0-9a-f]+:[   ]+c2202573[     ]+csrr[         ]+a0,vlenb
diff --git a/gas/testsuite/gas/riscv/extended/extended-csr.s b/gas/testsuite/gas/riscv/extended/extended-csr.s
new file mode 100644 (file)
index 0000000..5826c12
--- /dev/null
@@ -0,0 +1,12 @@
+       .macro csr val
+       csrr a0,\val
+       .endm
+
+       # Vector
+       csr vstart
+       csr vxsat
+       csr vxrm
+       csr vcsr
+       csr vl
+       csr vtype
+       csr vlenb
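The expected opcodes in extended-csr.d above follow directly from the `csrrs` encoding (`csrr a0,csr` is `csrrs x10,csr,x0`); a quick sketch of the arithmetic, with the draft v0.10 CSR addresses, reproduces them:

```python
def csrr_a0(csr_addr):
    """Encode `csrr a0, csr`, i.e. csrrs x10, csr, x0."""
    OPCODE_SYSTEM = 0x73   # SYSTEM major opcode
    FUNCT3_CSRRS = 0x2     # csrrs
    RD_A0 = 10             # x10 / a0; rs1 = x0 is the zero field
    return (csr_addr << 20) | (FUNCT3_CSRRS << 12) | (RD_A0 << 7) | OPCODE_SYSTEM

# Matches the disassembly expected by the test:
assert csrr_a0(0x008) == 0x00802573   # vstart
assert csrr_a0(0xC20) == 0xC2002573   # vl
assert csrr_a0(0xC22) == 0xC2202573   # vlenb
```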
index 8767c848a1fb2bea87af2949b6f6afd06a02c565..3966ef8783615f68a96fc5a38ca8e18913672465 100644 (file)
 # MA 02110-1301, USA.
 
 if [istarget riscv*-*-*] {
+    run_dump_tests "vector-insns"
+    run_dump_tests "vector-insns-vmsgtvx"
+    run_dump_tests "vector-insns-zero-imm"
+    run_dump_tests "vector-insns-fail-arith-fixp"
+    run_dump_tests "vector-insns-fail-arith-floatp"
+    run_dump_tests "vector-insns-fail-arith-int"
+    run_dump_tests "vector-insns-fail-arith-narrow"
+    run_dump_tests "vector-insns-fail-arith-widen"
+    run_dump_tests "vector-insns-fail-load-store"
+    run_dump_tests "vector-insns-fail-mask"
+    run_dump_tests "vector-insns-fail-permutation"
+    run_dump_tests "vector-insns-fail-zvamo"
+
+    run_dump_tests "extended-csr"
 }
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.d
new file mode 100644 (file)
index 0000000..df48418
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32iv -mcheck-constraints
+#source: vector-insns-fail-arith-fixp.s
+#error_output: vector-insns-fail-arith-fixp.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.l
new file mode 100644 (file)
index 0000000..3481174
--- /dev/null
@@ -0,0 +1,27 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vm `vsaddu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsaddu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsaddu.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsadd.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsadd.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssubu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssubu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssub.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vaaddu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vaaddu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vaadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vaadd.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vasubu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vasubu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vasub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vasub.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsmul.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsmul.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssrl.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssrl.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssrl.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssra.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssra.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssra.vi v0,v4,31,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-fixp.s
new file mode 100644 (file)
index 0000000..1fbcb60
--- /dev/null
@@ -0,0 +1,81 @@
+# Vector Single-Width Saturating Add and Subtract
+
+       vsaddu.vv v4, v4, v8            # OK
+       vsaddu.vv v8, v4, v8            # OK
+       vsaddu.vv v0, v4, v8, v0.t      # vd overlap vm
+       vsaddu.vx v4, v4, a1            # OK
+       vsaddu.vx v0, v4, a1, v0.t      # vd overlap vm
+       vsaddu.vi v4, v4, 15            # OK
+       vsaddu.vi v0, v4, 15, v0.t      # vd overlap vm
+
+       vsadd.vv v4, v4, v8
+       vsadd.vv v8, v4, v8
+       vsadd.vv v0, v4, v8, v0.t
+       vsadd.vx v4, v4, a1
+       vsadd.vx v0, v4, a1, v0.t
+       vsadd.vi v4, v4, 15
+       vsadd.vi v0, v4, 15, v0.t
+
+       vssubu.vv v4, v4, v8            # OK
+       vssubu.vv v8, v4, v8            # OK
+       vssubu.vv v0, v4, v8, v0.t      # vd overlap vm
+       vssubu.vx v4, v4, a1            # OK
+       vssubu.vx v0, v4, a1, v0.t      # vd overlap vm
+
+       vssub.vv v4, v4, v8
+       vssub.vv v8, v4, v8
+       vssub.vv v0, v4, v8, v0.t
+       vssub.vx v4, v4, a1
+       vssub.vx v0, v4, a1, v0.t
+
+# Vector Single-Width Averaging Add and Subtract
+
+       vaaddu.vv v4, v4, v8            # OK
+       vaaddu.vv v8, v4, v8            # OK
+       vaaddu.vv v0, v4, v8, v0.t      # vd overlap vm
+       vaaddu.vx v4, v4, a1            # OK
+       vaaddu.vx v0, v4, a1, v0.t      # vd overlap vm
+
+       vaadd.vv v4, v4, v8
+       vaadd.vv v8, v4, v8
+       vaadd.vv v0, v4, v8, v0.t
+       vaadd.vx v4, v4, a1
+       vaadd.vx v0, v4, a1, v0.t
+
+       vasubu.vv v4, v4, v8
+       vasubu.vv v8, v4, v8
+       vasubu.vv v0, v4, v8, v0.t
+       vasubu.vx v4, v4, a1
+       vasubu.vx v0, v4, a1, v0.t
+
+       vasub.vv v4, v4, v8
+       vasub.vv v8, v4, v8
+       vasub.vv v0, v4, v8, v0.t
+       vasub.vx v4, v4, a1
+       vasub.vx v0, v4, a1, v0.t
+
+# Vector Single-Width Fractional Multiply with Rounding and Saturation
+
+       vsmul.vv v4, v4, v8             # OK
+       vsmul.vv v8, v4, v8             # OK
+       vsmul.vv v0, v4, v8, v0.t       # vd overlap vm
+       vsmul.vx v4, v4, a1             # OK
+       vsmul.vx v0, v4, a1, v0.t       # vd overlap vm
+
+# Vector Single-Width Scaling Shift Instructions
+
+       vssrl.vv v4, v4, v8             # OK
+       vssrl.vv v8, v4, v8             # OK
+       vssrl.vv v0, v4, v8, v0.t       # vd overlap vm
+       vssrl.vx v4, v4, a1             # OK
+       vssrl.vx v0, v4, a1, v0.t       # vd overlap vm
+       vssrl.vi v4, v4, 31             # OK
+       vssrl.vi v0, v4, 31, v0.t       # vd overlap vm
+
+       vssra.vv v4, v4, v8
+       vssra.vv v8, v4, v8
+       vssra.vv v0, v4, v8, v0.t
+       vssra.vx v4, v4, a1
+       vssra.vx v0, v4, a1, v0.t
+       vssra.vi v4, v4, 31
+       vssra.vi v0, v4, 31, v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.d
new file mode 100644 (file)
index 0000000..796f7e2
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32ifv -mcheck-constraints
+#source: vector-insns-fail-arith-floatp.s
+#error_output: vector-insns-fail-arith-floatp.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.l
new file mode 100644 (file)
index 0000000..bcc49a0
--- /dev/null
@@ -0,0 +1,48 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vm `vfadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfadd.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsub.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfrsub.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmul.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmul.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfdiv.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfdiv.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfrdiv.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmacc.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmacc.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmacc.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmacc.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmsac.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmsac.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmsac.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmsac.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmadd.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmadd.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmsub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmsub.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmsub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfnmsub.vf v0,fa1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsqrt.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfrece7.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfrsqrte7.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfclass.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmin.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmin.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmax.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfmax.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfneg.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsgnj.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsgnj.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsgnjn.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsgnjn.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsgnjx.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfsgnjx.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfcvt.xu.f.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfcvt.x.f.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfcvt.rtz.xu.f.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfcvt.rtz.x.f.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfcvt.f.xu.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfcvt.f.x.v v0,v4,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-floatp.s
new file mode 100644 (file)
index 0000000..a48b1a3
--- /dev/null
@@ -0,0 +1,155 @@
+# Vector Single-Width Floating-Point Add/Subtract Instructions
+
+       vfadd.vv v4, v4, v8             # OK
+       vfadd.vv v8, v4, v8             # OK
+       vfadd.vv v0, v4, v8, v0.t       # vd overlap vm
+       vfadd.vf v4, v4, fa1            # OK
+       vfadd.vf v0, v4, fa1, v0.t      # vd overlap vm
+
+       vfsub.vv v4, v4, v8
+       vfsub.vv v8, v4, v8
+       vfsub.vv v0, v4, v8, v0.t
+       vfsub.vf v4, v4, fa1
+       vfsub.vf v0, v4, fa1, v0.t
+
+       vfrsub.vf v4, v4, fa1           # OK
+       vfrsub.vf v0, v4, fa1, v0.t     # vd overlap vm
+
+# Vector Single-Width Floating-Point Multiply/Divide Instructions
+
+       vfmul.vv v4, v4, v8             # OK
+       vfmul.vv v8, v4, v8             # OK
+       vfmul.vv v0, v4, v8, v0.t       # vd overlap vm
+       vfmul.vf v4, v4, fa1            # OK
+       vfmul.vf v0, v4, fa1, v0.t      # vd overlap vm
+
+       vfdiv.vv v4, v4, v8
+       vfdiv.vv v8, v4, v8
+       vfdiv.vv v0, v4, v8, v0.t
+       vfdiv.vf v4, v4, fa1
+       vfdiv.vf v0, v4, fa1, v0.t
+
+       vfrdiv.vf v4, v4, fa1           # OK
+       vfrdiv.vf v0, v4, fa1, v0.t     # vd overlap vm
+
+# Vector Single-Width Floating-Point Fused Multiply-Add Instructions
+
+       vfmacc.vv v4, v4, v8            # OK
+       vfmacc.vv v8, v4, v8            # OK
+       vfmacc.vv v0, v4, v8, v0.t      # vd overlap vm
+       vfmacc.vf v4, fa1, v4           # OK
+       vfmacc.vf v0, fa1, v4, v0.t     # vd overlap vm
+
+       vfnmacc.vv v4, v4, v8
+       vfnmacc.vv v8, v4, v8
+       vfnmacc.vv v0, v4, v8, v0.t
+       vfnmacc.vf v4, fa1, v4
+       vfnmacc.vf v0, fa1, v4, v0.t
+
+       vfmsac.vv v4, v4, v8
+       vfmsac.vv v8, v4, v8
+       vfmsac.vv v0, v4, v8, v0.t
+       vfmsac.vf v4, fa1, v4
+       vfmsac.vf v0, fa1, v4, v0.t
+
+       vfnmsac.vv v4, v4, v8
+       vfnmsac.vv v8, v4, v8
+       vfnmsac.vv v0, v4, v8, v0.t
+       vfnmsac.vf v4, fa1, v4
+       vfnmsac.vf v0, fa1, v4, v0.t
+
+       vfmadd.vv v4, v4, v8
+       vfmadd.vv v8, v4, v8
+       vfmadd.vv v0, v4, v8, v0.t
+       vfmadd.vf v4, fa1, v4
+       vfmadd.vf v0, fa1, v4, v0.t
+
+       vfnmadd.vv v4, v4, v8
+       vfnmadd.vv v8, v4, v8
+       vfnmadd.vv v0, v4, v8, v0.t
+       vfnmadd.vf v4, fa1, v4
+       vfnmadd.vf v0, fa1, v4, v0.t
+
+       vfmsub.vv v4, v4, v8
+       vfmsub.vv v8, v4, v8
+       vfmsub.vv v0, v4, v8, v0.t
+       vfmsub.vf v4, fa1, v4
+       vfmsub.vf v0, fa1, v4, v0.t
+
+       vfnmsub.vv v4, v4, v8
+       vfnmsub.vv v8, v4, v8
+       vfnmsub.vv v0, v4, v8, v0.t
+       vfnmsub.vf v4, fa1, v4
+       vfnmsub.vf v0, fa1, v4, v0.t
+
+# Vector Floating-Point Square-Root Instruction
+
+       vfsqrt.v v4, v4                 # OK
+       vfsqrt.v v0, v4, v0.t           # vd overlap vm
+
+# Vector Floating-Point Reciprocal Estimate Instruction
+
+       vfrece7.v v4, v4                # OK
+       vfrece7.v v0, v4, v0.t          # vd overlap vm
+
+# Vector Floating-Point Reciprocal Square-Root Estimate Instruction
+
+       vfrsqrte7.v v4, v4              # OK
+       vfrsqrte7.v v0, v4, v0.t        # vd overlap vm
+
+# Vector Floating-Point Classify Instruction
+
+       vfclass.v v4, v4                # OK
+       vfclass.v v0, v4, v0.t          # vd overlap vm
+
+# Vector Floating-Point MIN/MAX Instructions
+
+       vfmin.vv v4, v4, v8             # OK
+       vfmin.vv v8, v4, v8             # OK
+       vfmin.vv v0, v4, v8, v0.t       # vd overlap vm
+       vfmin.vf v4, v4, fa1            # OK
+       vfmin.vf v0, v4, fa1, v0.t      # vd overlap vm
+
+       vfmax.vv v4, v4, v8
+       vfmax.vv v8, v4, v8
+       vfmax.vv v0, v4, v8, v0.t
+       vfmax.vf v4, v4, fa1
+       vfmax.vf v0, v4, fa1, v0.t
+
+# Vector Floating-Point Sign-Injection Instructions
+
+       vfneg.v v4, v4                  # OK
+       vfneg.v v0, v4, v0.t            # vd overlap vm
+
+       vfsgnj.vv v4, v4, v8            # OK
+       vfsgnj.vv v8, v4, v8            # OK
+       vfsgnj.vv v0, v4, v8, v0.t      # vd overlap vm
+       vfsgnj.vf v4, v4, fa1           # OK
+       vfsgnj.vf v0, v4, fa1, v0.t     # vd overlap vm
+
+       vfsgnjn.vv v4, v4, v8
+       vfsgnjn.vv v8, v4, v8
+       vfsgnjn.vv v0, v4, v8, v0.t
+       vfsgnjn.vf v4, v4, fa1
+       vfsgnjn.vf v0, v4, fa1, v0.t
+
+       vfsgnjx.vv v4, v4, v8
+       vfsgnjx.vv v8, v4, v8
+       vfsgnjx.vv v0, v4, v8, v0.t
+       vfsgnjx.vf v4, v4, fa1
+       vfsgnjx.vf v0, v4, fa1, v0.t
+
+# Single-Width Floating-Point/Integer Type-Convert Instructions
+
+       vfcvt.xu.f.v v4, v4             # OK
+       vfcvt.xu.f.v v0, v4, v0.t       # vd overlap vm
+       vfcvt.x.f.v v4, v4
+       vfcvt.x.f.v v0, v4, v0.t
+       vfcvt.rtz.xu.f.v v4, v4
+       vfcvt.rtz.xu.f.v v0, v4, v0.t
+       vfcvt.rtz.x.f.v v4, v4
+       vfcvt.rtz.x.f.v v0, v4, v0.t
+       vfcvt.f.xu.v v4, v4
+       vfcvt.f.xu.v v0, v4, v0.t
+       vfcvt.f.x.v v4, v4
+       vfcvt.f.x.v v0, v4, v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.d
new file mode 100644 (file)
index 0000000..55b350b
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32iv -mcheck-constraints
+#source: vector-insns-fail-arith-int.s
+#error_output: vector-insns-fail-arith-int.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.l
new file mode 100644 (file)
index 0000000..5c9016d
--- /dev/null
@@ -0,0 +1,71 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vm `vneg.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vadd.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vadd.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsub.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vrsub.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vrsub.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vzext.vf2 v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsext.vf2 v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vzext.vf4 v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsext.vf4 v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vzext.vf8 v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsext.vf8 v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vadc.vvm v0,v4,v8,v0'
+.*Error: illegal operands vd cannot overlap vm `vadc.vxm v0,v4,a1,v0'
+.*Error: illegal operands vd cannot overlap vm `vadc.vim v0,v4,15,v0'
+.*Error: illegal operands vd cannot overlap vm `vsbc.vvm v0,v4,v8,v0'
+.*Error: illegal operands vd cannot overlap vm `vsbc.vxm v0,v4,a1,v0'
+.*Error: illegal operands vd cannot overlap vm `vnot.v v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vand.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vand.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vand.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vor.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vor.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vor.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vxor.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vxor.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vxor.vi v0,v4,15,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsll.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsll.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsll.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsrl.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsrl.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsrl.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsra.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsra.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsra.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vminu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vminu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmin.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmin.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmaxu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmaxu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmax.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmax.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmul.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmul.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmulh.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmulh.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmulhu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmulhu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmulhsu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmulhsu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vdivu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vdivu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vdiv.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vdiv.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vremu.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vremu.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vrem.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vrem.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmacc.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmacc.vx v0,a1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vnmsac.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vnmsac.vx v0,a1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmadd.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vmadd.vx v0,a1,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vnmsub.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vnmsub.vx v0,a1,v4,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-int.s
new file mode 100644 (file)
index 0000000..6ce4e42
--- /dev/null
@@ -0,0 +1,213 @@
+# Vector Single-Width Integer Add and Subtract
+
+       vneg.v v4, v4                   # OK
+       vneg.v v0, v4, v0.t             # vd overlap vm
+
+       vadd.vv v4, v4, v8              # OK
+       vadd.vv v8, v4, v8              # OK
+       vadd.vv v0, v4, v8, v0.t        # vd overlap vm
+       vadd.vx v4, v4, a1              # OK
+       vadd.vx v0, v4, a1, v0.t        # vd overlap vm
+       vadd.vi v4, v4, 15              # OK
+       vadd.vi v0, v4, 15, v0.t        # vd overlap vm
+
+       vsub.vv v4, v4, v8              # OK
+       vsub.vv v8, v4, v8              # OK
+       vsub.vv v0, v4, v8, v0.t        # vd overlap vm
+       vsub.vx v4, v4, a1              # OK
+       vsub.vx v0, v4, a1, v0.t        # vd overlap vm
+
+       vrsub.vx v4, v4, a1             # OK
+       vrsub.vx v0, v4, a1, v0.t       # vd overlap vm
+       vrsub.vi v4, v4, 15             # OK
+       vrsub.vi v0, v4, 15, v0.t       # vd overlap vm
+
+# Vector Integer Extension
+
+       vzext.vf2 v4, v4                # OK
+       vzext.vf2 v0, v4, v0.t          # vd overlap vm
+       vsext.vf2 v4, v4
+       vsext.vf2 v0, v4, v0.t
+       vzext.vf4 v4, v4
+       vzext.vf4 v0, v4, v0.t
+       vsext.vf4 v4, v4
+       vsext.vf4 v0, v4, v0.t
+       vzext.vf8 v4, v4
+       vzext.vf8 v0, v4, v0.t
+       vsext.vf8 v4, v4
+       vsext.vf8 v0, v4, v0.t
+
+# Vector Integer Add-with-Carry / Subtract-with-Borrow Instructions
+
+       vadc.vvm v4, v4, v8, v0         # OK
+       vadc.vvm v8, v4, v8, v0         # OK
+       vadc.vvm v0, v4, v8, v0         # vd overlap vm
+       vadc.vxm v4, v4, a1, v0         # OK
+       vadc.vxm v0, v4, a1, v0         # vd overlap vm
+       vadc.vim v4, v4, 15, v0         # OK
+       vadc.vim v0, v4, 15, v0         # vd overlap vm
+
+       vsbc.vvm v4, v4, v8, v0         # OK
+       vsbc.vvm v8, v4, v8, v0         # OK
+       vsbc.vvm v0, v4, v8, v0         # vd overlap vm
+       vsbc.vxm v4, v4, a1, v0         # OK
+       vsbc.vxm v0, v4, a1, v0         # vd overlap vm
+
+# Vector Bitwise Logical Instructions
+
+       vnot.v v4, v4                   # OK
+       vnot.v v0, v4, v0.t             # vd overlap vm
+
+       vand.vv v4, v4, v8              # OK
+       vand.vv v8, v4, v8              # OK
+       vand.vv v0, v4, v8, v0.t        # vd overlap vm
+       vand.vx v4, v4, a1              # OK
+       vand.vx v0, v4, a1, v0.t        # vd overlap vm
+       vand.vi v4, v4, 15              # OK
+       vand.vi v0, v4, 15, v0.t        # vd overlap vm
+
+       vor.vv  v4, v4, v8
+       vor.vv v8, v4, v8
+       vor.vv v0, v4, v8, v0.t
+       vor.vx v4, v4, a1
+       vor.vx v0, v4, a1, v0.t
+       vor.vi v4, v4, 15
+       vor.vi v0, v4, 15, v0.t
+
+       vxor.vv v4, v4, v8
+       vxor.vv v8, v4, v8
+       vxor.vv v0, v4, v8, v0.t
+       vxor.vx v4, v4, a1
+       vxor.vx v0, v4, a1, v0.t
+       vxor.vi v4, v4, 15
+       vxor.vi v0, v4, 15, v0.t
+
+# Vector Single-Width Bit Shift Instructions
+
+       vsll.vv v4, v4, v8              # OK
+       vsll.vv v8, v4, v8              # OK
+       vsll.vv v0, v4, v8, v0.t        # vd overlap vm
+       vsll.vx v4, v4, a1              # OK
+       vsll.vx v0, v4, a1, v0.t        # vd overlap vm
+       vsll.vi v4, v4, 31              # OK
+       vsll.vi v0, v4, 31, v0.t        # vd overlap vm
+
+       vsrl.vv v4, v4, v8
+       vsrl.vv v8, v4, v8
+       vsrl.vv v0, v4, v8, v0.t
+       vsrl.vx v4, v4, a1
+       vsrl.vx v0, v4, a1, v0.t
+       vsrl.vi v4, v4, 31
+       vsrl.vi v0, v4, 31, v0.t
+
+       vsra.vv v4, v4, v8
+       vsra.vv v8, v4, v8
+       vsra.vv v0, v4, v8, v0.t
+       vsra.vx v4, v4, a1
+       vsra.vx v0, v4, a1, v0.t
+       vsra.vi v4, v4, 31
+       vsra.vi v0, v4, 31, v0.t
+
+# Vector Integer Min/Max Instructions
+
+       vminu.vv v4, v4, v8             # OK
+       vminu.vv v8, v4, v8             # OK
+       vminu.vv v0, v4, v8, v0.t       # vd overlap vm
+       vminu.vx v4, v4, a1             # OK
+       vminu.vx v0, v4, a1, v0.t       # vd overlap vm
+
+       vmin.vv v4, v4, v8
+       vmin.vv v8, v4, v8
+       vmin.vv v0, v4, v8, v0.t
+       vmin.vx v4, v4, a1
+       vmin.vx v0, v4, a1, v0.t
+
+       vmaxu.vv v4, v4, v8
+       vmaxu.vv v8, v4, v8
+       vmaxu.vv v0, v4, v8, v0.t
+       vmaxu.vx v4, v4, a1
+       vmaxu.vx v0, v4, a1, v0.t
+
+       vmax.vv v4, v4, v8
+       vmax.vv v8, v4, v8
+       vmax.vv v0, v4, v8, v0.t
+       vmax.vx v4, v4, a1
+       vmax.vx v0, v4, a1, v0.t
+
+# Vector Single-Width Integer Multiply Instructions
+
+       vmul.vv v4, v4, v8              # OK
+       vmul.vv v8, v4, v8              # OK
+       vmul.vv v0, v4, v8, v0.t        # vd overlap vm
+       vmul.vx v4, v4, a1              # OK
+       vmul.vx v0, v4, a1, v0.t        # vd overlap vm
+
+       vmulh.vv v4, v4, v8
+       vmulh.vv v8, v4, v8
+       vmulh.vv v0, v4, v8, v0.t
+       vmulh.vx v4, v4, a1
+       vmulh.vx v0, v4, a1, v0.t
+
+       vmulhu.vv v4, v4, v8
+       vmulhu.vv v8, v4, v8
+       vmulhu.vv v0, v4, v8, v0.t
+       vmulhu.vx v4, v4, a1
+       vmulhu.vx v0, v4, a1, v0.t
+
+       vmulhsu.vv v4, v4, v8
+       vmulhsu.vv v8, v4, v8
+       vmulhsu.vv v0, v4, v8, v0.t
+       vmulhsu.vx v4, v4, a1
+       vmulhsu.vx v0, v4, a1, v0.t
+
+# Vector Integer Divide Instructions
+
+       vdivu.vv v4, v4, v8             # OK
+       vdivu.vv v8, v4, v8             # OK
+       vdivu.vv v0, v4, v8, v0.t       # vd overlap vm
+       vdivu.vx v4, v4, a1             # OK
+       vdivu.vx v0, v4, a1, v0.t       # vd overlap vm
+
+       vdiv.vv v4, v4, v8
+       vdiv.vv v8, v4, v8
+       vdiv.vv v0, v4, v8, v0.t
+       vdiv.vx v4, v4, a1
+       vdiv.vx v0, v4, a1, v0.t
+
+       vremu.vv v4, v4, v8
+       vremu.vv v8, v4, v8
+       vremu.vv v0, v4, v8, v0.t
+       vremu.vx v4, v4, a1
+       vremu.vx v0, v4, a1, v0.t
+
+       vrem.vv v4, v4, v8
+       vrem.vv v8, v4, v8
+       vrem.vv v0, v4, v8, v0.t
+       vrem.vx v4, v4, a1
+       vrem.vx v0, v4, a1, v0.t
+
+# Vector Single-Width Integer Multiply-Add Instructions
+
+       vmacc.vv v4, v4, v8             # OK
+       vmacc.vv v8, v4, v8             # OK
+       vmacc.vv v0, v4, v8, v0.t       # vd overlap vm
+       vmacc.vx v4, a1, v4             # OK
+       vmacc.vx v0, a1, v4, v0.t       # vd overlap vm
+
+       vnmsac.vv v4, v4, v8
+       vnmsac.vv v8, v4, v8
+       vnmsac.vv v0, v4, v8, v0.t
+       vnmsac.vx v4, a1, v4
+       vnmsac.vx v0, a1, v4, v0.t
+
+       vmadd.vv v4, v4, v8
+       vmadd.vv v8, v4, v8
+       vmadd.vv v0, v4, v8, v0.t
+       vmadd.vx v4, a1, v4
+       vmadd.vx v0, a1, v4, v0.t
+
+       vnmsub.vv v4, v4, v8
+       vnmsub.vv v8, v4, v8
+       vnmsub.vv v0, v4, v8, v0.t
+       vnmsub.vx v4, a1, v4
+       vnmsub.vx v0, a1, v4, v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.d
new file mode 100644 (file)
index 0000000..e7a4d4e
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32ifv -mcheck-constraints
+#source: vector-insns-fail-arith-narrow.s
+#error_output: vector-insns-fail-arith-narrow.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.l
new file mode 100644 (file)
index 0000000..3a3634c
--- /dev/null
@@ -0,0 +1,85 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vs2 `vncvt.x.x.w v2,v2'
+.*Error: illegal operands vd must be multiple of 2 `vncvt.x.x.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vncvt.x.x.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vncvt.x.x.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wv v2,v2,v4'
+.*Error: illegal operands vd must be multiple of 2 `vnsrl.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wv v3,v2,v4'
+.*Error: illegal operands vd cannot overlap vm `vnsrl.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wx v2,v2,a1'
+.*Error: illegal operands vd must be multiple of 2 `vnsrl.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wx v3,v2,a1'
+.*Error: illegal operands vd cannot overlap vm `vnsrl.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wi v2,v2,31'
+.*Error: illegal operands vd must be multiple of 2 `vnsrl.wi v2,v3,31'
+.*Error: illegal operands vd cannot overlap vs2 `vnsrl.wi v3,v2,31'
+.*Error: illegal operands vd cannot overlap vm `vnsrl.wi v0,v2,31,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnsra.wv v2,v2,v4'
+.*Error: illegal operands vd must be multiple of 2 `vnsra.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vnsra.wv v3,v2,v4'
+.*Error: illegal operands vd cannot overlap vm `vnsra.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnsra.wx v2,v2,a1'
+.*Error: illegal operands vd must be multiple of 2 `vnsra.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vnsra.wx v3,v2,a1'
+.*Error: illegal operands vd cannot overlap vm `vnsra.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnsra.wi v2,v2,31'
+.*Error: illegal operands vs2 must be multiple of 2 `vnsra.wi v2,v3,31'
+.*Error: illegal operands vd cannot overlap vs2 `vnsra.wi v3,v2,31'
+.*Error: illegal operands vd cannot overlap vm `vnsra.wi v0,v2,31,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wv v2,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vnclipu.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wv v3,v2,v4'
+.*Error: illegal operands vd cannot overlap vm `vnclipu.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wx v2,v2,a1'
+.*Error: illegal operands vs2 must be multiple of 2 `vnclipu.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wx v3,v2,a1'
+.*Error: illegal operands vd cannot overlap vm `vnclipu.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wi v2,v2,31'
+.*Error: illegal operands vs2 must be multiple of 2 `vnclipu.wi v2,v3,31'
+.*Error: illegal operands vd cannot overlap vs2 `vnclipu.wi v3,v2,31'
+.*Error: illegal operands vd cannot overlap vm `vnclipu.wi v0,v2,31,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnclip.wv v2,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vnclip.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vnclip.wv v3,v2,v4'
+.*Error: illegal operands vd cannot overlap vm `vnclip.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnclip.wx v2,v2,a1'
+.*Error: illegal operands vs2 must be multiple of 2 `vnclip.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vnclip.wx v3,v2,a1'
+.*Error: illegal operands vd cannot overlap vm `vnclip.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vnclip.wi v2,v2,31'
+.*Error: illegal operands vs2 must be multiple of 2 `vnclip.wi v2,v3,31'
+.*Error: illegal operands vd cannot overlap vs2 `vnclip.wi v3,v2,31'
+.*Error: illegal operands vd cannot overlap vm `vnclip.wi v0,v2,31,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.xu.f.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.xu.f.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.xu.f.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.xu.f.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.x.f.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.x.f.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.x.f.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.x.f.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.xu.f.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.rtz.xu.f.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.xu.f.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.rtz.xu.f.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.x.f.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.rtz.x.f.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rtz.x.f.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.rtz.x.f.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.xu.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.f.xu.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.xu.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.f.xu.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.x.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.f.x.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.x.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.f.x.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.f.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.f.f.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.f.f.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.f.f.w v0,v2,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rod.f.f.w v2,v2'
+.*Error: illegal operands vs2 must be multiple of 2 `vfncvt.rod.f.f.w v2,v3'
+.*Error: illegal operands vd cannot overlap vs2 `vfncvt.rod.f.f.w v3,v2'
+.*Error: illegal operands vd cannot overlap vm `vfncvt.rod.f.f.w v0,v2,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-narrow.s
new file mode 100644 (file)
index 0000000..73b96ef
--- /dev/null
@@ -0,0 +1,100 @@
+# Vector Narrowing Integer Right Shift Instructions
+
+       # vncvt.x.x.w vd,vs,vm = vnsrl.wx vd,vs,x0,vm
+       vncvt.x.x.w v2, v2              # vd overlap vs2
+       vncvt.x.x.w v2, v3              # vs2 should be multiple of 2
+       vncvt.x.x.w v3, v2              # vd overlap vs2
+       vncvt.x.x.w v0, v2, v0.t        # vd overlap vm
+
+       vnsrl.wv v2, v2, v4             # vd overlap vs2
+       vnsrl.wv v2, v3, v4             # vs2 should be multiple of 2
+       vnsrl.wv v3, v2, v4             # vd overlap vs2
+       vnsrl.wv v4, v2, v4             # OK
+       vnsrl.wv v0, v2, v4, v0.t       # vd overlap vm
+       vnsrl.wx v2, v2, a1             # vd overlap vs2
+       vnsrl.wx v2, v3, a1             # vs2 should be multiple of 2
+       vnsrl.wx v3, v2, a1             # vd overlap vs2
+       vnsrl.wx v0, v2, a1, v0.t       # vd overlap vm
+       vnsrl.wi v2, v2, 31             # vd overlap vs2
+       vnsrl.wi v2, v3, 31             # vs2 should be multiple of 2
+       vnsrl.wi v3, v2, 31             # vd overlap vs2
+       vnsrl.wi v0, v2, 31, v0.t       # vd overlap vm
+
+       vnsra.wv v2, v2, v4
+       vnsra.wv v2, v3, v4
+       vnsra.wv v3, v2, v4
+       vnsra.wv v4, v2, v4
+       vnsra.wv v0, v2, v4, v0.t
+       vnsra.wx v2, v2, a1
+       vnsra.wx v2, v3, a1
+       vnsra.wx v3, v2, a1
+       vnsra.wx v0, v2, a1, v0.t
+       vnsra.wi v2, v2, 31
+       vnsra.wi v2, v3, 31
+       vnsra.wi v3, v2, 31
+       vnsra.wi v0, v2, 31, v0.t
+
+# Vector Narrowing Fixed-Point Clip Instructions
+
+       vnclipu.wv v2, v2, v4           # vd overlap vs2
+       vnclipu.wv v2, v3, v4           # vs2 should be multiple of 2
+       vnclipu.wv v3, v2, v4           # vd overlap vs2
+       vnclipu.wv v4, v2, v4           # OK
+       vnclipu.wv v0, v2, v4, v0.t     # vd overlap vm
+       vnclipu.wx v2, v2, a1           # vd overlap vs2
+       vnclipu.wx v2, v3, a1           # vs2 should be multiple of 2
+       vnclipu.wx v3, v2, a1           # vd overlap vs2
+       vnclipu.wx v0, v2, a1, v0.t     # vd overlap vm
+       vnclipu.wi v2, v2, 31           # vd overlap vs2
+       vnclipu.wi v2, v3, 31           # vs2 should be multiple of 2
+       vnclipu.wi v3, v2, 31           # vd overlap vs2
+       vnclipu.wi v0, v2, 31, v0.t     # vd overlap vm
+
+       vnclip.wv v2, v2, v4
+       vnclip.wv v2, v3, v4
+       vnclip.wv v3, v2, v4
+       vnclip.wv v4, v2, v4
+       vnclip.wv v0, v2, v4, v0.t
+       vnclip.wx v2, v2, a1
+       vnclip.wx v2, v3, a1
+       vnclip.wx v3, v2, a1
+       vnclip.wx v0, v2, a1, v0.t
+       vnclip.wi v2, v2, 31
+       vnclip.wi v2, v3, 31
+       vnclip.wi v3, v2, 31
+       vnclip.wi v0, v2, 31, v0.t
+
+# Narrowing Floating-Point/Integer Type-Convert Instructions
+
+       vfncvt.xu.f.w v2, v2            # vd overlap vs2
+       vfncvt.xu.f.w v2, v3            # vs2 should be multiple of 2
+       vfncvt.xu.f.w v3, v2            # vd overlap vs2
+       vfncvt.xu.f.w v0, v2, v0.t      # vd overlap vm
+       vfncvt.x.f.w v2, v2
+       vfncvt.x.f.w v2, v3
+       vfncvt.x.f.w v3, v2
+       vfncvt.x.f.w v0, v2, v0.t
+       vfncvt.rtz.xu.f.w v2, v2
+       vfncvt.rtz.xu.f.w v2, v3
+       vfncvt.rtz.xu.f.w v3, v2
+       vfncvt.rtz.xu.f.w v0, v2, v0.t
+       vfncvt.rtz.x.f.w v2, v2
+       vfncvt.rtz.x.f.w v2, v3
+       vfncvt.rtz.x.f.w v3, v2
+       vfncvt.rtz.x.f.w v0, v2, v0.t
+       vfncvt.f.xu.w v2, v2
+       vfncvt.f.xu.w v2, v3
+       vfncvt.f.xu.w v3, v2
+       vfncvt.f.xu.w v0, v2, v0.t
+       vfncvt.f.x.w v2, v2
+       vfncvt.f.x.w v2, v3
+       vfncvt.f.x.w v3, v2
+       vfncvt.f.x.w v0, v2, v0.t
+       vfncvt.f.f.w v2, v2
+       vfncvt.f.f.w v2, v3
+       vfncvt.f.f.w v3, v2
+       vfncvt.f.f.w v0, v2, v0.t
+       vfncvt.rod.f.f.w v2, v2
+       vfncvt.rod.f.f.w v2, v3
+       vfncvt.rod.f.f.w v3, v2
+       vfncvt.rod.f.f.w v0, v2, v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.d
new file mode 100644 (file)
index 0000000..e5f0348
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32ifv -mcheck-constraints
+#source: vector-insns-fail-arith-widen.s
+#error_output: vector-insns-fail-arith-widen.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.l
new file mode 100644 (file)
index 0000000..5f22ca9
--- /dev/null
@@ -0,0 +1,253 @@
+.*: Assembler messages:
+.*Error: illegal operands vd must be multiple of 2 `vwcvtu.x.x.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwcvtu.x.x.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwcvtu.x.x.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vwcvtu.x.x.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwcvt.x.x.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwcvt.x.x.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwcvt.x.x.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vwcvt.x.x.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwaddu.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwaddu.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwaddu.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwaddu.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwaddu.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwaddu.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwaddu.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwaddu.wv v1,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vwaddu.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwaddu.wv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwaddu.wv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwaddu.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwaddu.wx v1,v2,a1'
+.*Error: illegal operands vs2 must be multiple of 2 `vwaddu.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwaddu.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsubu.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsubu.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsubu.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwsubu.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsubu.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwsubu.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwsubu.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsubu.wv v1,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vwsubu.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsubu.wv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsubu.wv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwsubu.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsubu.wx v1,v2,a1'
+.*Error: illegal operands vs2 must be multiple of 2 `vwsubu.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwsubu.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwadd.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwadd.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwadd.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwadd.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwadd.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwadd.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwadd.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwadd.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwadd.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwadd.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwadd.wv v1,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vwadd.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwadd.wv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwadd.wv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwadd.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwadd.wx v1,v2,a1'
+.*Error: illegal operands vs2 must be multiple of 2 `vwadd.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwadd.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsub.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwsub.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwsub.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsub.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsub.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwsub.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsub.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwsub.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwsub.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwsub.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsub.wv v1,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vwsub.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsub.wv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwsub.wv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwsub.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwsub.wx v1,v2,a1'
+.*Error: illegal operands vs2 must be multiple of 2 `vwsub.wx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwsub.wx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmul.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmul.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmul.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmul.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmul.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwmul.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmul.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwmul.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwmul.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwmul.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmulu.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmulu.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmulu.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwmulu.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmulu.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulu.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwmulu.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmulsu.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmulsu.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmulsu.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwmulsu.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmulsu.vx v1,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vx v2,v2,a1'
+.*Error: illegal operands vd cannot overlap vs2 `vwmulsu.vx v2,v3,a1'
+.*Error: illegal operands vd cannot overlap vm `vwmulsu.vx v0,v2,a1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmaccu.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmaccu.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmaccu.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwmaccu.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmaccu.vx v1,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vx v2,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccu.vx v2,a1,v3'
+.*Error: illegal operands vd cannot overlap vm `vwmaccu.vx v0,a1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmacc.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmacc.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmacc.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwmacc.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmacc.vx v1,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vx v2,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmacc.vx v2,a1,v3'
+.*Error: illegal operands vd cannot overlap vm `vwmacc.vx v0,a1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmaccsu.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmaccsu.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vwmaccsu.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vwmaccsu.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmaccsu.vx v1,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vx v2,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccsu.vx v2,a1,v3'
+.*Error: illegal operands vd cannot overlap vm `vwmaccsu.vx v0,a1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vwmaccus.vx v1,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccus.vx v2,a1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vwmaccus.vx v2,a1,v3'
+.*Error: illegal operands vd cannot overlap vm `vwmaccus.vx v0,a1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwadd.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwadd.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwadd.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwadd.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwadd.vf v1,v2,fa1'
+.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vf v2,v2,fa1'
+.*Error: illegal operands vd cannot overlap vs2 `vfwadd.vf v2,v3,fa1'
+.*Error: illegal operands vd cannot overlap vm `vfwadd.vf v0,v2,fa1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwadd.wv v1,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vfwadd.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwadd.wv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwadd.wv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwadd.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwsub.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwsub.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwsub.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwsub.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwsub.vf v1,v2,fa1'
+.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vf v2,v2,fa1'
+.*Error: illegal operands vd cannot overlap vs2 `vfwsub.vf v2,v3,fa1'
+.*Error: illegal operands vd cannot overlap vm `vfwsub.vf v0,v2,fa1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwsub.wv v1,v2,v4'
+.*Error: illegal operands vs2 must be multiple of 2 `vfwsub.wv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwsub.wv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwsub.wv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwsub.wv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwmul.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmul.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmul.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwmul.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwmul.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwmul.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwmul.vf v1,v2,fa1'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmul.vf v2,v2,fa1'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmul.vf v2,v3,fa1'
+.*Error: illegal operands vd cannot overlap vm `vfwmul.vf v0,v2,fa1,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwmacc.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwmacc.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwmacc.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwmacc.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwmacc.vf v1,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vf v2,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmacc.vf v2,fa1,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwmacc.vf v0,fa1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwnmacc.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwnmacc.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwnmacc.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwnmacc.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwnmacc.vf v1,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vf v2,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmacc.vf v2,fa1,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwnmacc.vf v0,fa1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwmsac.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwmsac.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwmsac.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwmsac.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwmsac.vf v1,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vf v2,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwmsac.vf v2,fa1,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwmsac.vf v0,fa1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwnmsac.vv v1,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwnmsac.vv v2,v2,v4'
+.*Error: illegal operands vd cannot overlap vs1 `vfwnmsac.vv v2,v3,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vv v4,v2,v4'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vv v4,v2,v5'
+.*Error: illegal operands vd cannot overlap vm `vfwnmsac.vv v0,v2,v4,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwnmsac.vf v1,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vf v2,fa1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwnmsac.vf v2,fa1,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwnmsac.vf v0,fa1,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.xu.f.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.xu.f.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.xu.f.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.xu.f.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.x.f.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.x.f.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.x.f.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.x.f.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.rtz.xu.f.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.xu.f.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.xu.f.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.rtz.xu.f.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.rtz.x.f.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.x.f.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.rtz.x.f.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.rtz.x.f.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.f.xu.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.xu.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.xu.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.f.xu.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.f.x.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.x.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.x.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.f.x.v v0,v2,v0.t'
+.*Error: illegal operands vd must be multiple of 2 `vfwcvt.f.f.v v1,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.f.v v2,v2'
+.*Error: illegal operands vd cannot overlap vs2 `vfwcvt.f.f.v v2,v3'
+.*Error: illegal operands vd cannot overlap vm `vfwcvt.f.f.v v0,v2,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-arith-widen.s
new file mode 100644 (file)
index 0000000..addedd4
--- /dev/null
@@ -0,0 +1,297 @@
+# Vector Widening Integer Add/Subtract
+
+       # vwcvtu.x.x.v vd,vs,vm = vwaddu.vx vd,vs,x0,vm
+       vwcvtu.x.x.v v1, v2             # vd should be multiple of 2
+       vwcvtu.x.x.v v2, v2             # vd overlap vs2
+       vwcvtu.x.x.v v2, v3             # vd overlap vs2
+       vwcvtu.x.x.v v0, v2, v0.t       # vd overlap vm
+
+       # vwcvt.x.x.v vd,vs,vm = vwadd.vx vd,vs,x0,vm
+       vwcvt.x.x.v v1, v2
+       vwcvt.x.x.v v2, v2
+       vwcvt.x.x.v v2, v3
+       vwcvt.x.x.v v0, v2, v0.t
+
+       vwaddu.vv v1, v2, v4            # vd should be multiple of 2
+       vwaddu.vv v2, v2, v4            # vd overlap vs2
+       vwaddu.vv v2, v3, v4            # vd overlap vs2
+       vwaddu.vv v4, v2, v4            # vd overlap vs1
+       vwaddu.vv v4, v2, v5            # vd overlap vs1
+       vwaddu.vv v0, v2, v4, v0.t      # vd overlap vm
+       vwaddu.vx v1, v2, a1            # vd should be multiple of 2
+       vwaddu.vx v2, v2, a1            # vd overlap vs2
+       vwaddu.vx v2, v3, a1            # vd overlap vs2
+       vwaddu.vx v0, v2, a1, v0.t      # vd overlap vm
+       vwaddu.wv v1, v2, v4            # vd should be multiple of 2
+       vwaddu.wv v2, v2, v4            # OK
+       vwaddu.wv v2, v3, v4            # vs2 should be multiple of 2
+       vwaddu.wv v4, v2, v4            # vd overlap vs1
+       vwaddu.wv v4, v2, v5            # vd overlap vs1
+       vwaddu.wv v0, v2, v4, v0.t      # vd overlap vm
+       vwaddu.wx v1, v2, a1            # vd should be multiple of 2
+       vwaddu.wx v2, v2, a1            # OK
+       vwaddu.wx v2, v3, a1            # vs2 should be multiple of 2
+       vwaddu.wx v0, v2, a1, v0.t      # vd overlap vm
+
+       vwsubu.vv v1, v2, v4
+       vwsubu.vv v2, v2, v4
+       vwsubu.vv v2, v3, v4
+       vwsubu.vv v4, v2, v4
+       vwsubu.vv v4, v2, v5
+       vwsubu.vv v0, v2, v4, v0.t
+       vwsubu.vx v1, v2, a1
+       vwsubu.vx v2, v2, a1
+       vwsubu.vx v2, v3, a1
+       vwsubu.vx v0, v2, a1, v0.t
+       vwsubu.wv v1, v2, v4
+       vwsubu.wv v2, v2, v4
+       vwsubu.wv v2, v3, v4
+       vwsubu.wv v4, v2, v4
+       vwsubu.wv v4, v2, v5
+       vwsubu.wv v0, v2, v4, v0.t
+       vwsubu.wx v1, v2, a1
+       vwsubu.wx v2, v2, a1
+       vwsubu.wx v2, v3, a1
+       vwsubu.wx v0, v2, a1, v0.t
+
+       vwadd.vv v1, v2, v4
+       vwadd.vv v2, v2, v4
+       vwadd.vv v2, v3, v4
+       vwadd.vv v4, v2, v4
+       vwadd.vv v4, v2, v5
+       vwadd.vv v0, v2, v4, v0.t
+       vwadd.vx v1, v2, a1
+       vwadd.vx v2, v2, a1
+       vwadd.vx v2, v3, a1
+       vwadd.vx v0, v2, a1, v0.t
+       vwadd.wv v1, v2, v4
+       vwadd.wv v2, v2, v4
+       vwadd.wv v2, v3, v4
+       vwadd.wv v4, v2, v4
+       vwadd.wv v4, v2, v5
+       vwadd.wv v0, v2, v4, v0.t
+       vwadd.wx v1, v2, a1
+       vwadd.wx v2, v2, a1
+       vwadd.wx v2, v3, a1
+       vwadd.wx v0, v2, a1, v0.t
+
+       vwsub.vv v1, v2, v4
+       vwsub.vv v2, v2, v4
+       vwsub.vv v2, v3, v4
+       vwsub.vv v4, v2, v4
+       vwsub.vv v4, v2, v5
+       vwsub.vv v0, v2, v4, v0.t
+       vwsub.vx v1, v2, a1
+       vwsub.vx v2, v2, a1
+       vwsub.vx v2, v3, a1
+       vwsub.vx v0, v2, a1, v0.t
+       vwsub.wv v1, v2, v4
+       vwsub.wv v2, v2, v4
+       vwsub.wv v2, v3, v4
+       vwsub.wv v4, v2, v4
+       vwsub.wv v4, v2, v5
+       vwsub.wv v0, v2, v4, v0.t
+       vwsub.wx v1, v2, a1
+       vwsub.wx v2, v2, a1
+       vwsub.wx v2, v3, a1
+       vwsub.wx v0, v2, a1, v0.t
+
+# Vector Widening Integer Multiply Instructions
+
+       vwmul.vv v1, v2, v4             # vd should be multiple of 2
+       vwmul.vv v2, v2, v4             # vd overlap vs2
+       vwmul.vv v2, v3, v4             # vd overlap vs2
+       vwmul.vv v4, v2, v4             # vd overlap vs1
+       vwmul.vv v4, v2, v5             # vd overlap vs1
+       vwmul.vv v0, v2, v4, v0.t       # vd overlap vm
+       vwmul.vx v1, v2, a1             # vd should be multiple of 2
+       vwmul.vx v2, v2, a1             # vd overlap vs2
+       vwmul.vx v2, v3, a1             # vd overlap vs2
+       vwmul.vx v0, v2, a1, v0.t       # vd overlap vm
+
+       vwmulu.vv v1, v2, v4
+       vwmulu.vv v2, v2, v4
+       vwmulu.vv v2, v3, v4
+       vwmulu.vv v4, v2, v4
+       vwmulu.vv v4, v2, v5
+       vwmulu.vv v0, v2, v4, v0.t
+       vwmulu.vx v1, v2, a1
+       vwmulu.vx v2, v2, a1
+       vwmulu.vx v2, v3, a1
+       vwmulu.vx v0, v2, a1, v0.t
+
+       vwmulsu.vv v1, v2, v4
+       vwmulsu.vv v2, v2, v4
+       vwmulsu.vv v2, v3, v4
+       vwmulsu.vv v4, v2, v4
+       vwmulsu.vv v4, v2, v5
+       vwmulsu.vv v0, v2, v4, v0.t
+       vwmulsu.vx v1, v2, a1
+       vwmulsu.vx v2, v2, a1
+       vwmulsu.vx v2, v3, a1
+       vwmulsu.vx v0, v2, a1, v0.t
+
+# Vector Widening Integer Multiply-Add Instructions
+
+       vwmaccu.vv v1, v2, v4           # vd should be multiple of 2
+       vwmaccu.vv v2, v2, v4           # vd overlap vs1
+       vwmaccu.vv v2, v3, v4           # vd overlap vs1
+       vwmaccu.vv v4, v2, v4           # vd overlap vs2
+       vwmaccu.vv v4, v2, v5           # vd overlap vs2
+       vwmaccu.vv v0, v2, v4, v0.t     # vd overlap vm
+       vwmaccu.vx v1, a1, v2           # vd should be multiple of 2
+       vwmaccu.vx v2, a1, v2           # vd overlap vs2
+       vwmaccu.vx v2, a1, v3           # vd overlap vs2
+       vwmaccu.vx v0, a1, v2, v0.t     # vd overlap vm
+
+       vwmacc.vv v1, v2, v4
+       vwmacc.vv v2, v2, v4
+       vwmacc.vv v2, v3, v4
+       vwmacc.vv v4, v2, v4
+       vwmacc.vv v4, v2, v5
+       vwmacc.vv v0, v2, v4, v0.t
+       vwmacc.vx v1, a1, v2
+       vwmacc.vx v2, a1, v2
+       vwmacc.vx v2, a1, v3
+       vwmacc.vx v0, a1, v2, v0.t
+
+       vwmaccsu.vv v1, v2, v4
+       vwmaccsu.vv v2, v2, v4
+       vwmaccsu.vv v2, v3, v4
+       vwmaccsu.vv v4, v2, v4
+       vwmaccsu.vv v4, v2, v5
+       vwmaccsu.vv v0, v2, v4, v0.t
+       vwmaccsu.vx v1, a1, v2
+       vwmaccsu.vx v2, a1, v2
+       vwmaccsu.vx v2, a1, v3
+       vwmaccsu.vx v0, a1, v2, v0.t
+
+       vwmaccus.vx v1, a1, v2          # vd should be multiple of 2
+       vwmaccus.vx v2, a1, v2          # vd overlap vs2
+       vwmaccus.vx v2, a1, v3          # vd overlap vs2
+       vwmaccus.vx v0, a1, v2, v0.t    # vd overlap vm
+
+# Vector Widening Floating-Point Add/Subtract Instructions
+
+       vfwadd.vv v1, v2, v4            # vd should be multiple of 2
+       vfwadd.vv v2, v2, v4            # vd overlap vs2
+       vfwadd.vv v2, v3, v4            # vd overlap vs2
+       vfwadd.vv v4, v2, v4            # vd overlap vs1
+       vfwadd.vv v4, v2, v5            # vd overlap vs1
+       vfwadd.vv v0, v2, v4, v0.t      # vd overlap vm
+       vfwadd.vf v1, v2, fa1           # vd should be multiple of 2
+       vfwadd.vf v2, v2, fa1           # vd overlap vs2
+       vfwadd.vf v2, v3, fa1           # vd overlap vs2
+       vfwadd.vf v0, v2, fa1, v0.t     # vd overlap vm
+       vfwadd.wv v1, v2, v4            # vd should be multiple of 2
+       vfwadd.wv v2, v2, v4            # OK
+       vfwadd.wv v2, v3, v4            # vs2 should be multiple of 2
+       vfwadd.wv v4, v2, v4            # vd overlap vs1
+       vfwadd.wv v4, v2, v5            # vd overlap vs1
+       vfwadd.wv v0, v2, v4, v0.t      # vd overlap vm
+
+       vfwsub.vv v1, v2, v4
+       vfwsub.vv v2, v2, v4
+       vfwsub.vv v2, v3, v4
+       vfwsub.vv v4, v2, v4
+       vfwsub.vv v4, v2, v5
+       vfwsub.vv v0, v2, v4, v0.t
+       vfwsub.vf v1, v2, fa1
+       vfwsub.vf v2, v2, fa1
+       vfwsub.vf v2, v3, fa1
+       vfwsub.vf v0, v2, fa1, v0.t
+       vfwsub.wv v1, v2, v4
+       vfwsub.wv v2, v2, v4
+       vfwsub.wv v2, v3, v4
+       vfwsub.wv v4, v2, v4
+       vfwsub.wv v4, v2, v5
+       vfwsub.wv v0, v2, v4, v0.t
+
+# Vector Widening Floating-Point Multiply
+
+       vfwmul.vv v1, v2, v4            # vd should be multiple of 2
+       vfwmul.vv v2, v2, v4            # vd overlap vs2
+       vfwmul.vv v2, v3, v4            # vd overlap vs2
+       vfwmul.vv v4, v2, v4            # vd overlap vs1
+       vfwmul.vv v4, v2, v5            # vd overlap vs1
+       vfwmul.vv v0, v2, v4, v0.t      # vd overlap vm
+       vfwmul.vf v1, v2, fa1           # vd should be multiple of 2
+       vfwmul.vf v2, v2, fa1           # vd overlap vs2
+       vfwmul.vf v2, v3, fa1           # vd overlap vs2
+       vfwmul.vf v0, v2, fa1, v0.t     # vd overlap vm
+
+# Vector Widening Floating-Point Fused Multiply-Add Instructions
+
+       vfwmacc.vv v1, v2, v4           # vd should be multiple of 2
+       vfwmacc.vv v2, v2, v4           # vd overlap vs1
+       vfwmacc.vv v2, v3, v4           # vd overlap vs1
+       vfwmacc.vv v4, v2, v4           # vd overlap vs2
+       vfwmacc.vv v4, v2, v5           # vd overlap vs2
+       vfwmacc.vv v0, v2, v4, v0.t     # vd overlap vm
+       vfwmacc.vf v1, fa1, v2          # vd should be multiple of 2
+       vfwmacc.vf v2, fa1, v2          # vd overlap vs2
+       vfwmacc.vf v2, fa1, v3          # vd overlap vs2
+       vfwmacc.vf v0, fa1, v2, v0.t    # vd overlap vm
+
+       vfwnmacc.vv v1, v2, v4
+       vfwnmacc.vv v2, v2, v4
+       vfwnmacc.vv v2, v3, v4
+       vfwnmacc.vv v4, v2, v4
+       vfwnmacc.vv v4, v2, v5
+       vfwnmacc.vv v0, v2, v4, v0.t
+       vfwnmacc.vf v1, fa1, v2
+       vfwnmacc.vf v2, fa1, v2
+       vfwnmacc.vf v2, fa1, v3
+       vfwnmacc.vf v0, fa1, v2, v0.t
+
+       vfwmsac.vv v1, v2, v4
+       vfwmsac.vv v2, v2, v4
+       vfwmsac.vv v2, v3, v4
+       vfwmsac.vv v4, v2, v4
+       vfwmsac.vv v4, v2, v5
+       vfwmsac.vv v0, v2, v4, v0.t
+       vfwmsac.vf v1, fa1, v2
+       vfwmsac.vf v2, fa1, v2
+       vfwmsac.vf v2, fa1, v3
+       vfwmsac.vf v0, fa1, v2, v0.t
+
+       vfwnmsac.vv v1, v2, v4
+       vfwnmsac.vv v2, v2, v4
+       vfwnmsac.vv v2, v3, v4
+       vfwnmsac.vv v4, v2, v4
+       vfwnmsac.vv v4, v2, v5
+       vfwnmsac.vv v0, v2, v4, v0.t
+       vfwnmsac.vf v1, fa1, v2
+       vfwnmsac.vf v2, fa1, v2
+       vfwnmsac.vf v2, fa1, v3
+       vfwnmsac.vf v0, fa1, v2, v0.t
+
+# Widening Floating-Point/Integer Type-Convert Instructions
+
+       vfwcvt.xu.f.v v1, v2            # vd should be multiple of 2
+       vfwcvt.xu.f.v v2, v2            # vd overlap vs2
+       vfwcvt.xu.f.v v2, v3            # vd overlap vs2
+       vfwcvt.xu.f.v v0, v2, v0.t      # vd overlap vm
+       vfwcvt.x.f.v v1, v2
+       vfwcvt.x.f.v v2, v2
+       vfwcvt.x.f.v v2, v3
+       vfwcvt.x.f.v v0, v2, v0.t
+       vfwcvt.rtz.xu.f.v v1, v2
+       vfwcvt.rtz.xu.f.v v2, v2
+       vfwcvt.rtz.xu.f.v v2, v3
+       vfwcvt.rtz.xu.f.v v0, v2, v0.t
+       vfwcvt.rtz.x.f.v v1, v2
+       vfwcvt.rtz.x.f.v v2, v2
+       vfwcvt.rtz.x.f.v v2, v3
+       vfwcvt.rtz.x.f.v v0, v2, v0.t
+       vfwcvt.f.xu.v v1, v2
+       vfwcvt.f.xu.v v2, v2
+       vfwcvt.f.xu.v v2, v3
+       vfwcvt.f.xu.v v0, v2, v0.t
+       vfwcvt.f.x.v v1, v2
+       vfwcvt.f.x.v v2, v2
+       vfwcvt.f.x.v v2, v3
+       vfwcvt.f.x.v v0, v2, v0.t
+       vfwcvt.f.f.v v1, v2
+       vfwcvt.f.f.v v2, v2
+       vfwcvt.f.f.v v2, v3
+       vfwcvt.f.f.v v0, v2, v0.t
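The overlap constraints exercised by the testcase above can be modeled with a small checker. This is a hypothetical sketch, not part of the patch or of gas; it assumes LMUL=1, where a widening instruction writes a 2*SEW result and so its destination occupies the register group vd..vd+1, mirroring the `vd should be multiple of 2`, `vd overlap vs1/vs2`, and `vd overlap vm` comments in the testcase.

```python
def widening_vv_ok(vd, vs1, vs2, masked=False):
    """Model the legality of `vwadd.vv vd, vs2, vs1` at LMUL=1.

    A widening op produces a double-width result, so the destination
    group is {vd, vd+1}.  The assembler constraints checked by the
    testcase are:
      - vd must be a multiple of 2 (register group alignment),
      - the narrow sources vs1/vs2 must not overlap the wide vd group,
      - a masked op (vm = v0) must not write the v0 group.
    """
    if vd % 2 != 0:                  # "vd should be multiple of 2"
        return False
    dest = {vd, vd + 1}              # wide destination register group
    if vs1 in dest or vs2 in dest:   # "vd overlap vs1" / "vd overlap vs2"
        return False
    if masked and 0 in dest:         # "vd overlap vm"
        return False
    return True

# Mirrors the testcase rows, e.g.:
#   vwadd.vv v6, v2, v4        -> legal
#   vwadd.vv v1, v2, v4        -> vd should be multiple of 2
#   vwadd.vv v4, v2, v4        -> vd overlap vs1
#   vwadd.vv v0, v2, v4, v0.t  -> vd overlap vm
```

The `.wv` forms relax the vs2 rule (vs2 is already wide, so it must itself be aligned and may fully overlap vd), which is why `vfwadd.wv v2, v2, v4` is marked `# OK` above; this sketch only covers the `.vv` case.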
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.d
new file mode 100644 (file)
index 0000000..763191f
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32iv -mcheck-constraints
+#source: vector-insns-fail-load-store.s
+#error_output: vector-insns-fail-load-store.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.l
new file mode 100644 (file)
index 0000000..9ef99a5
--- /dev/null
@@ -0,0 +1,419 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vm `vle8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vle64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vse8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vse16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vse32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vse64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlse8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlse16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlse32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlse64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsse8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsse16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsse32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsse64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vloxei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vloxei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vloxei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vloxei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsoxei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsoxei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsoxei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsoxei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vluxei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vluxei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vluxei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vluxei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsuxei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsuxei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsuxei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsuxei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg2e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg3e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg4e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg5e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg6e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg7e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg8e8.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e8ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg2e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg3e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg4e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg5e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg6e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg7e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg8e16.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e16ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg2e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg3e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg4e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg5e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg6e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg7e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg8e32.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e32ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg2e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg2e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg3e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg3e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg4e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg4e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg5e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg5e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg6e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg6e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg7e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg7e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vsseg8e64.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlseg8e64ff.v v0,\(a0\),v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg2e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg2e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg3e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg3e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg4e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg4e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg5e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg5e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg6e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg6e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg7e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg7e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg8e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg8e8.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg2e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg2e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg3e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg3e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg4e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg4e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg5e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg5e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg6e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg6e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg7e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg7e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg8e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg8e16.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg2e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg2e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg3e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg3e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg4e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg4e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg5e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg5e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg6e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg6e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg7e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg7e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg8e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg8e32.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg2e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg2e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg3e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg3e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg4e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg4e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg5e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg5e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg6e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg6e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg7e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg7e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vlsseg8e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vssseg8e64.v v0,\(a0\),a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg2ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg2ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg2ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg2ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg3ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg3ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg3ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg3ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg4ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg4ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg4ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg4ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg5ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg5ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg5ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg5ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg6ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg6ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg6ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg6ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg7ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg7ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg7ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg7ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg8ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg8ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg8ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg8ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg2ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg2ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg2ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg2ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg3ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg3ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg3ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg3ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg4ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg4ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg4ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg4ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg5ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg5ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg5ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg5ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg6ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg6ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg6ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg6ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg7ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg7ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg7ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg7ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg8ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg8ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg8ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg8ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg2ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg2ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg2ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg2ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg3ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg3ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg3ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg3ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg4ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg4ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg4ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg4ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg5ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg5ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg5ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg5ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg6ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg6ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg6ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg6ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg7ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg7ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg7ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg7ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg8ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg8ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg8ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg8ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg2ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg2ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg2ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg2ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg3ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg3ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg3ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg3ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg4ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg4ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg4ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg4ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg5ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg5ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg5ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg5ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg6ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg6ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg6ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg6ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg7ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg7ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg7ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg7ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vloxseg8ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vloxseg8ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsoxseg8ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsoxseg8ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg2ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg2ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg2ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg2ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg3ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg3ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg3ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg3ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg4ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg4ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg4ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg4ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg5ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg5ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg5ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg5ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg6ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg6ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg6ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg6ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg7ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg7ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg7ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg7ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg8ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg8ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg8ei8.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg8ei8.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg2ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg2ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg2ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg2ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg3ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg3ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg3ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg3ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg4ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg4ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg4ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg4ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg5ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg5ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg5ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg5ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg6ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg6ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg6ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg6ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg7ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg7ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg7ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg7ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg8ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg8ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg8ei16.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg8ei16.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg2ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg2ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg2ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg2ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg3ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg3ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg3ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg3ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg4ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg4ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg4ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg4ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg5ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg5ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg5ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg5ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg6ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg6ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg6ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg6ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg7ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg7ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg7ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg7ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg8ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg8ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg8ei32.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg8ei32.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg2ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg2ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg2ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg2ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg3ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg3ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg3ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg3ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg4ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg4ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg4ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg4ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg5ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg5ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg5ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg5ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg6ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg6ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg6ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg6ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg7ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg7ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg7ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg7ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vluxseg8ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vluxseg8ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vsuxseg8ei64.v v4,\(a0\),v4'
+.*Error: illegal operands vd cannot overlap vm `vsuxseg8ei64.v v0,\(a0\),v4,v0.t'
+.*Error: illegal operands vd must be multiple of nf `vl2r.v v31,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl2re8.v v31,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl2re16.v v31,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl2re32.v v31,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl2re64.v v31,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl4r.v v30,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl4re8.v v30,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl4re16.v v30,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl4re32.v v30,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl4re64.v v30,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl8r.v v26,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl8re8.v v26,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl8re16.v v26,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl8re32.v v26,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vl8re64.v v26,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vs2r.v v31,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vs4r.v v30,\(a0\)'
+.*Error: illegal operands vd must be multiple of nf `vs8r.v v26,\(a0\)'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-load-store.s
new file mode 100644 (file)
index 0000000..274d0b4
--- /dev/null
@@ -0,0 +1,481 @@
+# Vector Unit-Stride Loads and Stores
+
+       vle8.v v0, (a0), v0.t           # vd overlap vm
+       vle8ff.v v0, (a0), v0.t         # vd overlap vm
+       vle16.v v0, (a0), v0.t
+       vle16ff.v v0, (a0), v0.t
+       vle32.v v0, (a0), v0.t
+       vle32ff.v v0, (a0), v0.t
+       vle64.v v0, (a0), v0.t
+       vle64ff.v v0, (a0), v0.t
+
+       vse8.v v0, (a0), v0.t           # vd overlap vm
+       vse16.v v0, (a0), v0.t
+       vse32.v v0, (a0), v0.t
+       vse64.v v0, (a0), v0.t
+
+# Vector Strided Loads and Stores
+
+       vlse8.v v0, (a0), a1, v0.t      # vd overlap vm
+       vlse16.v v0, (a0), a1, v0.t
+       vlse32.v v0, (a0), a1, v0.t
+       vlse64.v v0, (a0), a1, v0.t
+
+       vsse8.v v0, (a0), a1, v0.t
+       vsse16.v v0, (a0), a1, v0.t
+       vsse32.v v0, (a0), a1, v0.t
+       vsse64.v v0, (a0), a1, v0.t
+
+# Vector Ordered Indexed Loads and Stores
+
+       vloxei8.v v4, (a0), v4          # OK
+       vloxei8.v v0, (a0), v4, v0.t    # vd overlap vm
+       vloxei16.v v4, (a0), v4
+       vloxei16.v v0, (a0), v4, v0.t
+       vloxei32.v v4, (a0), v4
+       vloxei32.v v0, (a0), v4, v0.t
+       vloxei64.v v4, (a0), v4
+       vloxei64.v v0, (a0), v4, v0.t
+
+       vsoxei8.v v4, (a0), v4
+       vsoxei8.v v0, (a0), v4, v0.t
+       vsoxei16.v v4, (a0), v4
+       vsoxei16.v v0, (a0), v4, v0.t
+       vsoxei32.v v4, (a0), v4
+       vsoxei32.v v0, (a0), v4, v0.t
+       vsoxei64.v v4, (a0), v4
+       vsoxei64.v v0, (a0), v4, v0.t
+
+# Vector Unordered Indexed Loads and Stores
+
+       vluxei8.v v4, (a0), v4          # OK
+       vluxei8.v v0, (a0), v4, v0.t    # vd overlap vm
+       vluxei16.v v4, (a0), v4
+       vluxei16.v v0, (a0), v4, v0.t
+       vluxei32.v v4, (a0), v4
+       vluxei32.v v0, (a0), v4, v0.t
+       vluxei64.v v4, (a0), v4
+       vluxei64.v v0, (a0), v4, v0.t
+
+       vsuxei8.v v4, (a0), v4
+       vsuxei8.v v0, (a0), v4, v0.t
+       vsuxei16.v v4, (a0), v4
+       vsuxei16.v v0, (a0), v4, v0.t
+       vsuxei32.v v4, (a0), v4
+       vsuxei32.v v0, (a0), v4, v0.t
+       vsuxei64.v v4, (a0), v4
+       vsuxei64.v v0, (a0), v4, v0.t
+
+# Vector Unit-Stride Segment Loads and Stores
+
+       vlseg2e8.v v0, (a0), v0.t       # vd overlap vm
+       vsseg2e8.v v0, (a0), v0.t       # vd overlap vm
+       vlseg2e8ff.v v0, (a0), v0.t     # vd overlap vm
+       vlseg3e8.v v0, (a0), v0.t
+       vsseg3e8.v v0, (a0), v0.t
+       vlseg3e8ff.v v0, (a0), v0.t
+       vlseg4e8.v v0, (a0), v0.t
+       vsseg4e8.v v0, (a0), v0.t
+       vlseg4e8ff.v v0, (a0), v0.t
+       vlseg5e8.v v0, (a0), v0.t
+       vsseg5e8.v v0, (a0), v0.t
+       vlseg5e8ff.v v0, (a0), v0.t
+       vlseg6e8.v v0, (a0), v0.t
+       vsseg6e8.v v0, (a0), v0.t
+       vlseg6e8ff.v v0, (a0), v0.t
+       vlseg7e8.v v0, (a0), v0.t
+       vsseg7e8.v v0, (a0), v0.t
+       vlseg7e8ff.v v0, (a0), v0.t
+       vlseg8e8.v v0, (a0), v0.t
+       vsseg8e8.v v0, (a0), v0.t
+       vlseg8e8ff.v v0, (a0), v0.t
+
+       vlseg2e16.v v0, (a0), v0.t
+       vsseg2e16.v v0, (a0), v0.t
+       vlseg2e16ff.v v0, (a0), v0.t
+       vlseg3e16.v v0, (a0), v0.t
+       vsseg3e16.v v0, (a0), v0.t
+       vlseg3e16ff.v v0, (a0), v0.t
+       vlseg4e16.v v0, (a0), v0.t
+       vsseg4e16.v v0, (a0), v0.t
+       vlseg4e16ff.v v0, (a0), v0.t
+       vlseg5e16.v v0, (a0), v0.t
+       vsseg5e16.v v0, (a0), v0.t
+       vlseg5e16ff.v v0, (a0), v0.t
+       vlseg6e16.v v0, (a0), v0.t
+       vsseg6e16.v v0, (a0), v0.t
+       vlseg6e16ff.v v0, (a0), v0.t
+       vlseg7e16.v v0, (a0), v0.t
+       vsseg7e16.v v0, (a0), v0.t
+       vlseg7e16ff.v v0, (a0), v0.t
+       vlseg8e16.v v0, (a0), v0.t
+       vsseg8e16.v v0, (a0), v0.t
+       vlseg8e16ff.v v0, (a0), v0.t
+
+       vlseg2e32.v v0, (a0), v0.t
+       vsseg2e32.v v0, (a0), v0.t
+       vlseg2e32ff.v v0, (a0), v0.t
+       vlseg3e32.v v0, (a0), v0.t
+       vsseg3e32.v v0, (a0), v0.t
+       vlseg3e32ff.v v0, (a0), v0.t
+       vlseg4e32.v v0, (a0), v0.t
+       vsseg4e32.v v0, (a0), v0.t
+       vlseg4e32ff.v v0, (a0), v0.t
+       vlseg5e32.v v0, (a0), v0.t
+       vsseg5e32.v v0, (a0), v0.t
+       vlseg5e32ff.v v0, (a0), v0.t
+       vlseg6e32.v v0, (a0), v0.t
+       vsseg6e32.v v0, (a0), v0.t
+       vlseg6e32ff.v v0, (a0), v0.t
+       vlseg7e32.v v0, (a0), v0.t
+       vsseg7e32.v v0, (a0), v0.t
+       vlseg7e32ff.v v0, (a0), v0.t
+       vlseg8e32.v v0, (a0), v0.t
+       vsseg8e32.v v0, (a0), v0.t
+       vlseg8e32ff.v v0, (a0), v0.t
+
+       vlseg2e64.v v0, (a0), v0.t
+       vsseg2e64.v v0, (a0), v0.t
+       vlseg2e64ff.v v0, (a0), v0.t
+       vlseg3e64.v v0, (a0), v0.t
+       vsseg3e64.v v0, (a0), v0.t
+       vlseg3e64ff.v v0, (a0), v0.t
+       vlseg4e64.v v0, (a0), v0.t
+       vsseg4e64.v v0, (a0), v0.t
+       vlseg4e64ff.v v0, (a0), v0.t
+       vlseg5e64.v v0, (a0), v0.t
+       vsseg5e64.v v0, (a0), v0.t
+       vlseg5e64ff.v v0, (a0), v0.t
+       vlseg6e64.v v0, (a0), v0.t
+       vsseg6e64.v v0, (a0), v0.t
+       vlseg6e64ff.v v0, (a0), v0.t
+       vlseg7e64.v v0, (a0), v0.t
+       vsseg7e64.v v0, (a0), v0.t
+       vlseg7e64ff.v v0, (a0), v0.t
+       vlseg8e64.v v0, (a0), v0.t
+       vsseg8e64.v v0, (a0), v0.t
+       vlseg8e64ff.v v0, (a0), v0.t
+
+# Vector Strided Segment Loads and Stores
+
+       vlsseg2e8.v v0, (a0), a1, v0.t          # vd overlap vm
+       vssseg2e8.v v0, (a0), a1, v0.t          # vd overlap vm
+       vlsseg3e8.v v0, (a0), a1, v0.t
+       vssseg3e8.v v0, (a0), a1, v0.t
+       vlsseg4e8.v v0, (a0), a1, v0.t
+       vssseg4e8.v v0, (a0), a1, v0.t
+       vlsseg5e8.v v0, (a0), a1, v0.t
+       vssseg5e8.v v0, (a0), a1, v0.t
+       vlsseg6e8.v v0, (a0), a1, v0.t
+       vssseg6e8.v v0, (a0), a1, v0.t
+       vlsseg7e8.v v0, (a0), a1, v0.t
+       vssseg7e8.v v0, (a0), a1, v0.t
+       vlsseg8e8.v v0, (a0), a1, v0.t
+       vssseg8e8.v v0, (a0), a1, v0.t
+
+       vlsseg2e16.v v0, (a0), a1, v0.t
+       vssseg2e16.v v0, (a0), a1, v0.t
+       vlsseg3e16.v v0, (a0), a1, v0.t
+       vssseg3e16.v v0, (a0), a1, v0.t
+       vlsseg4e16.v v0, (a0), a1, v0.t
+       vssseg4e16.v v0, (a0), a1, v0.t
+       vlsseg5e16.v v0, (a0), a1, v0.t
+       vssseg5e16.v v0, (a0), a1, v0.t
+       vlsseg6e16.v v0, (a0), a1, v0.t
+       vssseg6e16.v v0, (a0), a1, v0.t
+       vlsseg7e16.v v0, (a0), a1, v0.t
+       vssseg7e16.v v0, (a0), a1, v0.t
+       vlsseg8e16.v v0, (a0), a1, v0.t
+       vssseg8e16.v v0, (a0), a1, v0.t
+
+       vlsseg2e32.v v0, (a0), a1, v0.t
+       vssseg2e32.v v0, (a0), a1, v0.t
+       vlsseg3e32.v v0, (a0), a1, v0.t
+       vssseg3e32.v v0, (a0), a1, v0.t
+       vlsseg4e32.v v0, (a0), a1, v0.t
+       vssseg4e32.v v0, (a0), a1, v0.t
+       vlsseg5e32.v v0, (a0), a1, v0.t
+       vssseg5e32.v v0, (a0), a1, v0.t
+       vlsseg6e32.v v0, (a0), a1, v0.t
+       vssseg6e32.v v0, (a0), a1, v0.t
+       vlsseg7e32.v v0, (a0), a1, v0.t
+       vssseg7e32.v v0, (a0), a1, v0.t
+       vlsseg8e32.v v0, (a0), a1, v0.t
+       vssseg8e32.v v0, (a0), a1, v0.t
+
+       vlsseg2e64.v v0, (a0), a1, v0.t
+       vssseg2e64.v v0, (a0), a1, v0.t
+       vlsseg3e64.v v0, (a0), a1, v0.t
+       vssseg3e64.v v0, (a0), a1, v0.t
+       vlsseg4e64.v v0, (a0), a1, v0.t
+       vssseg4e64.v v0, (a0), a1, v0.t
+       vlsseg5e64.v v0, (a0), a1, v0.t
+       vssseg5e64.v v0, (a0), a1, v0.t
+       vlsseg6e64.v v0, (a0), a1, v0.t
+       vssseg6e64.v v0, (a0), a1, v0.t
+       vlsseg7e64.v v0, (a0), a1, v0.t
+       vssseg7e64.v v0, (a0), a1, v0.t
+       vlsseg8e64.v v0, (a0), a1, v0.t
+       vssseg8e64.v v0, (a0), a1, v0.t
+
+# Vector Ordered Indexed Segment Loads and Stores
+
+       vloxseg2ei8.v v4, (a0), v4              # vd overlap vs2
+       vloxseg2ei8.v v0, (a0), v4, v0.t        # vd overlap vm
+       vsoxseg2ei8.v v4, (a0), v4              # vd overlap vs2
+       vsoxseg2ei8.v v0, (a0), v4, v0.t        # vd overlap vm
+       vloxseg3ei8.v v4, (a0), v4
+       vloxseg3ei8.v v0, (a0), v4, v0.t
+       vsoxseg3ei8.v v4, (a0), v4
+       vsoxseg3ei8.v v0, (a0), v4, v0.t
+       vloxseg4ei8.v v4, (a0), v4
+       vloxseg4ei8.v v0, (a0), v4, v0.t
+       vsoxseg4ei8.v v4, (a0), v4
+       vsoxseg4ei8.v v0, (a0), v4, v0.t
+       vloxseg5ei8.v v4, (a0), v4
+       vloxseg5ei8.v v0, (a0), v4, v0.t
+       vsoxseg5ei8.v v4, (a0), v4
+       vsoxseg5ei8.v v0, (a0), v4, v0.t
+       vloxseg6ei8.v v4, (a0), v4
+       vloxseg6ei8.v v0, (a0), v4, v0.t
+       vsoxseg6ei8.v v4, (a0), v4
+       vsoxseg6ei8.v v0, (a0), v4, v0.t
+       vloxseg7ei8.v v4, (a0), v4
+       vloxseg7ei8.v v0, (a0), v4, v0.t
+       vsoxseg7ei8.v v4, (a0), v4
+       vsoxseg7ei8.v v0, (a0), v4, v0.t
+       vloxseg8ei8.v v4, (a0), v4
+       vloxseg8ei8.v v0, (a0), v4, v0.t
+       vsoxseg8ei8.v v4, (a0), v4
+       vsoxseg8ei8.v v0, (a0), v4, v0.t
+
+       vloxseg2ei16.v v4, (a0), v4
+       vloxseg2ei16.v v0, (a0), v4, v0.t
+       vsoxseg2ei16.v v4, (a0), v4
+       vsoxseg2ei16.v v0, (a0), v4, v0.t
+       vloxseg3ei16.v v4, (a0), v4
+       vloxseg3ei16.v v0, (a0), v4, v0.t
+       vsoxseg3ei16.v v4, (a0), v4
+       vsoxseg3ei16.v v0, (a0), v4, v0.t
+       vloxseg4ei16.v v4, (a0), v4
+       vloxseg4ei16.v v0, (a0), v4, v0.t
+       vsoxseg4ei16.v v4, (a0), v4
+       vsoxseg4ei16.v v0, (a0), v4, v0.t
+       vloxseg5ei16.v v4, (a0), v4
+       vloxseg5ei16.v v0, (a0), v4, v0.t
+       vsoxseg5ei16.v v4, (a0), v4
+       vsoxseg5ei16.v v0, (a0), v4, v0.t
+       vloxseg6ei16.v v4, (a0), v4
+       vloxseg6ei16.v v0, (a0), v4, v0.t
+       vsoxseg6ei16.v v4, (a0), v4
+       vsoxseg6ei16.v v0, (a0), v4, v0.t
+       vloxseg7ei16.v v4, (a0), v4
+       vloxseg7ei16.v v0, (a0), v4, v0.t
+       vsoxseg7ei16.v v4, (a0), v4
+       vsoxseg7ei16.v v0, (a0), v4, v0.t
+       vloxseg8ei16.v v4, (a0), v4
+       vloxseg8ei16.v v0, (a0), v4, v0.t
+       vsoxseg8ei16.v v4, (a0), v4
+       vsoxseg8ei16.v v0, (a0), v4, v0.t
+
+       vloxseg2ei32.v v4, (a0), v4
+       vloxseg2ei32.v v0, (a0), v4, v0.t
+       vsoxseg2ei32.v v4, (a0), v4
+       vsoxseg2ei32.v v0, (a0), v4, v0.t
+       vloxseg3ei32.v v4, (a0), v4
+       vloxseg3ei32.v v0, (a0), v4, v0.t
+       vsoxseg3ei32.v v4, (a0), v4
+       vsoxseg3ei32.v v0, (a0), v4, v0.t
+       vloxseg4ei32.v v4, (a0), v4
+       vloxseg4ei32.v v0, (a0), v4, v0.t
+       vsoxseg4ei32.v v4, (a0), v4
+       vsoxseg4ei32.v v0, (a0), v4, v0.t
+       vloxseg5ei32.v v4, (a0), v4
+       vloxseg5ei32.v v0, (a0), v4, v0.t
+       vsoxseg5ei32.v v4, (a0), v4
+       vsoxseg5ei32.v v0, (a0), v4, v0.t
+       vloxseg6ei32.v v4, (a0), v4
+       vloxseg6ei32.v v0, (a0), v4, v0.t
+       vsoxseg6ei32.v v4, (a0), v4
+       vsoxseg6ei32.v v0, (a0), v4, v0.t
+       vloxseg7ei32.v v4, (a0), v4
+       vloxseg7ei32.v v0, (a0), v4, v0.t
+       vsoxseg7ei32.v v4, (a0), v4
+       vsoxseg7ei32.v v0, (a0), v4, v0.t
+       vloxseg8ei32.v v4, (a0), v4
+       vloxseg8ei32.v v0, (a0), v4, v0.t
+       vsoxseg8ei32.v v4, (a0), v4
+       vsoxseg8ei32.v v0, (a0), v4, v0.t
+
+       vloxseg2ei64.v v4, (a0), v4
+       vloxseg2ei64.v v0, (a0), v4, v0.t
+       vsoxseg2ei64.v v4, (a0), v4
+       vsoxseg2ei64.v v0, (a0), v4, v0.t
+       vloxseg3ei64.v v4, (a0), v4
+       vloxseg3ei64.v v0, (a0), v4, v0.t
+       vsoxseg3ei64.v v4, (a0), v4
+       vsoxseg3ei64.v v0, (a0), v4, v0.t
+       vloxseg4ei64.v v4, (a0), v4
+       vloxseg4ei64.v v0, (a0), v4, v0.t
+       vsoxseg4ei64.v v4, (a0), v4
+       vsoxseg4ei64.v v0, (a0), v4, v0.t
+       vloxseg5ei64.v v4, (a0), v4
+       vloxseg5ei64.v v0, (a0), v4, v0.t
+       vsoxseg5ei64.v v4, (a0), v4
+       vsoxseg5ei64.v v0, (a0), v4, v0.t
+       vloxseg6ei64.v v4, (a0), v4
+       vloxseg6ei64.v v0, (a0), v4, v0.t
+       vsoxseg6ei64.v v4, (a0), v4
+       vsoxseg6ei64.v v0, (a0), v4, v0.t
+       vloxseg7ei64.v v4, (a0), v4
+       vloxseg7ei64.v v0, (a0), v4, v0.t
+       vsoxseg7ei64.v v4, (a0), v4
+       vsoxseg7ei64.v v0, (a0), v4, v0.t
+       vloxseg8ei64.v v4, (a0), v4
+       vloxseg8ei64.v v0, (a0), v4, v0.t
+       vsoxseg8ei64.v v4, (a0), v4
+       vsoxseg8ei64.v v0, (a0), v4, v0.t
+
+# Vector Unordered Indexed Segment Loads and Stores
+
+       vluxseg2ei8.v v4, (a0), v4              # vd overlap vs2
+       vluxseg2ei8.v v0, (a0), v4, v0.t        # vd overlap vm
+       vsuxseg2ei8.v v4, (a0), v4              # vd overlap vs2
+       vsuxseg2ei8.v v0, (a0), v4, v0.t        # vd overlap vm
+       vluxseg3ei8.v v4, (a0), v4
+       vluxseg3ei8.v v0, (a0), v4, v0.t
+       vsuxseg3ei8.v v4, (a0), v4
+       vsuxseg3ei8.v v0, (a0), v4, v0.t
+       vluxseg4ei8.v v4, (a0), v4
+       vluxseg4ei8.v v0, (a0), v4, v0.t
+       vsuxseg4ei8.v v4, (a0), v4
+       vsuxseg4ei8.v v0, (a0), v4, v0.t
+       vluxseg5ei8.v v4, (a0), v4
+       vluxseg5ei8.v v0, (a0), v4, v0.t
+       vsuxseg5ei8.v v4, (a0), v4
+       vsuxseg5ei8.v v0, (a0), v4, v0.t
+       vluxseg6ei8.v v4, (a0), v4
+       vluxseg6ei8.v v0, (a0), v4, v0.t
+       vsuxseg6ei8.v v4, (a0), v4
+       vsuxseg6ei8.v v0, (a0), v4, v0.t
+       vluxseg7ei8.v v4, (a0), v4
+       vluxseg7ei8.v v0, (a0), v4, v0.t
+       vsuxseg7ei8.v v4, (a0), v4
+       vsuxseg7ei8.v v0, (a0), v4, v0.t
+       vluxseg8ei8.v v4, (a0), v4
+       vluxseg8ei8.v v0, (a0), v4, v0.t
+       vsuxseg8ei8.v v4, (a0), v4
+       vsuxseg8ei8.v v0, (a0), v4, v0.t
+
+       vluxseg2ei16.v v4, (a0), v4
+       vluxseg2ei16.v v0, (a0), v4, v0.t
+       vsuxseg2ei16.v v4, (a0), v4
+       vsuxseg2ei16.v v0, (a0), v4, v0.t
+       vluxseg3ei16.v v4, (a0), v4
+       vluxseg3ei16.v v0, (a0), v4, v0.t
+       vsuxseg3ei16.v v4, (a0), v4
+       vsuxseg3ei16.v v0, (a0), v4, v0.t
+       vluxseg4ei16.v v4, (a0), v4
+       vluxseg4ei16.v v0, (a0), v4, v0.t
+       vsuxseg4ei16.v v4, (a0), v4
+       vsuxseg4ei16.v v0, (a0), v4, v0.t
+       vluxseg5ei16.v v4, (a0), v4
+       vluxseg5ei16.v v0, (a0), v4, v0.t
+       vsuxseg5ei16.v v4, (a0), v4
+       vsuxseg5ei16.v v0, (a0), v4, v0.t
+       vluxseg6ei16.v v4, (a0), v4
+       vluxseg6ei16.v v0, (a0), v4, v0.t
+       vsuxseg6ei16.v v4, (a0), v4
+       vsuxseg6ei16.v v0, (a0), v4, v0.t
+       vluxseg7ei16.v v4, (a0), v4
+       vluxseg7ei16.v v0, (a0), v4, v0.t
+       vsuxseg7ei16.v v4, (a0), v4
+       vsuxseg7ei16.v v0, (a0), v4, v0.t
+       vluxseg8ei16.v v4, (a0), v4
+       vluxseg8ei16.v v0, (a0), v4, v0.t
+       vsuxseg8ei16.v v4, (a0), v4
+       vsuxseg8ei16.v v0, (a0), v4, v0.t
+
+       vluxseg2ei32.v v4, (a0), v4
+       vluxseg2ei32.v v0, (a0), v4, v0.t
+       vsuxseg2ei32.v v4, (a0), v4
+       vsuxseg2ei32.v v0, (a0), v4, v0.t
+       vluxseg3ei32.v v4, (a0), v4
+       vluxseg3ei32.v v0, (a0), v4, v0.t
+       vsuxseg3ei32.v v4, (a0), v4
+       vsuxseg3ei32.v v0, (a0), v4, v0.t
+       vluxseg4ei32.v v4, (a0), v4
+       vluxseg4ei32.v v0, (a0), v4, v0.t
+       vsuxseg4ei32.v v4, (a0), v4
+       vsuxseg4ei32.v v0, (a0), v4, v0.t
+       vluxseg5ei32.v v4, (a0), v4
+       vluxseg5ei32.v v0, (a0), v4, v0.t
+       vsuxseg5ei32.v v4, (a0), v4
+       vsuxseg5ei32.v v0, (a0), v4, v0.t
+       vluxseg6ei32.v v4, (a0), v4
+       vluxseg6ei32.v v0, (a0), v4, v0.t
+       vsuxseg6ei32.v v4, (a0), v4
+       vsuxseg6ei32.v v0, (a0), v4, v0.t
+       vluxseg7ei32.v v4, (a0), v4
+       vluxseg7ei32.v v0, (a0), v4, v0.t
+       vsuxseg7ei32.v v4, (a0), v4
+       vsuxseg7ei32.v v0, (a0), v4, v0.t
+       vluxseg8ei32.v v4, (a0), v4
+       vluxseg8ei32.v v0, (a0), v4, v0.t
+       vsuxseg8ei32.v v4, (a0), v4
+       vsuxseg8ei32.v v0, (a0), v4, v0.t
+
+       vluxseg2ei64.v v4, (a0), v4
+       vluxseg2ei64.v v0, (a0), v4, v0.t
+       vsuxseg2ei64.v v4, (a0), v4
+       vsuxseg2ei64.v v0, (a0), v4, v0.t
+       vluxseg3ei64.v v4, (a0), v4
+       vluxseg3ei64.v v0, (a0), v4, v0.t
+       vsuxseg3ei64.v v4, (a0), v4
+       vsuxseg3ei64.v v0, (a0), v4, v0.t
+       vluxseg4ei64.v v4, (a0), v4
+       vluxseg4ei64.v v0, (a0), v4, v0.t
+       vsuxseg4ei64.v v4, (a0), v4
+       vsuxseg4ei64.v v0, (a0), v4, v0.t
+       vluxseg5ei64.v v4, (a0), v4
+       vluxseg5ei64.v v0, (a0), v4, v0.t
+       vsuxseg5ei64.v v4, (a0), v4
+       vsuxseg5ei64.v v0, (a0), v4, v0.t
+       vluxseg6ei64.v v4, (a0), v4
+       vluxseg6ei64.v v0, (a0), v4, v0.t
+       vsuxseg6ei64.v v4, (a0), v4
+       vsuxseg6ei64.v v0, (a0), v4, v0.t
+       vluxseg7ei64.v v4, (a0), v4
+       vluxseg7ei64.v v0, (a0), v4, v0.t
+       vsuxseg7ei64.v v4, (a0), v4
+       vsuxseg7ei64.v v0, (a0), v4, v0.t
+       vluxseg8ei64.v v4, (a0), v4
+       vluxseg8ei64.v v0, (a0), v4, v0.t
+       vsuxseg8ei64.v v4, (a0), v4
+       vsuxseg8ei64.v v0, (a0), v4, v0.t
+
+# Vector Load/Store Whole Register Instructions
+
+       vl1r.v v31, (a0)                # OK
+
+       vl2r.v v31, (a0)                # vd must be aligned to 2
+       vl2re8.v v31, (a0)
+       vl2re16.v v31, (a0)
+       vl2re32.v v31, (a0)
+       vl2re64.v v31, (a0)
+
+       vl4r.v v30, (a0)                # vd must be aligned to 4
+       vl4re8.v v30, (a0)
+       vl4re16.v v30, (a0)
+       vl4re32.v v30, (a0)
+       vl4re64.v v30, (a0)
+
+       vl8r.v v26, (a0)                # vd must be aligned to 8
+       vl8re8.v v26, (a0)
+       vl8re16.v v26, (a0)
+       vl8re32.v v26, (a0)
+       vl8re64.v v26, (a0)
+
+       vs2r.v v31, (a0)                # vs3 must be aligned to 2
+       vs4r.v v30, (a0)                # vs3 must be aligned to 4
+       vs8r.v v26, (a0)                # vs3 must be aligned to 8
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.d
new file mode 100644 (file)
index 0000000..35f9e2c
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32iv -mcheck-constraints
+#source: vector-insns-fail-mask.s
+#error_output: vector-insns-fail-mask.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.l
new file mode 100644 (file)
index 0000000..7ff5a3c
--- /dev/null
@@ -0,0 +1,10 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vs2 `vmsbf.m v4,v4'
+.*Error: illegal operands vd cannot overlap vm `vmsbf.m v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vmsif.m v4,v4'
+.*Error: illegal operands vd cannot overlap vm `vmsif.m v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vmsof.m v4,v4'
+.*Error: illegal operands vd cannot overlap vm `vmsof.m v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `viota.m v4,v4'
+.*Error: illegal operands vd cannot overlap vm `viota.m v0,v4,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vid.v v0,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-mask.s
new file mode 100644 (file)
index 0000000..99b19f6
--- /dev/null
@@ -0,0 +1,23 @@
+# Vector Set-before-first Mask Bit
+
+       vmsbf.m v4, v4          # vd overlap vs2
+       vmsbf.m v0, v4, v0.t    # vd overlap vm
+
+# Vector Set-including-first Mask Bit
+
+       vmsif.m v4, v4          # vd overlap vs2
+       vmsif.m v0, v4, v0.t    # vd overlap vm
+
+# Vector Set-only-first Mask Bit
+
+       vmsof.m v4, v4          # vd overlap vs2
+       vmsof.m v0, v4, v0.t    # vd overlap vm
+
+# Vector Iota Instruction
+
+       viota.m v4, v4          # vd overlap vs2
+       viota.m v0, v4, v0.t    # vd overlap vm
+
+# Vector Element Index Instruction
+
+       vid.v v0, v0.t          # vd overlap vm
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.d
new file mode 100644 (file)
index 0000000..9822e29
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32ifv -mcheck-constraints
+#source: vector-insns-fail-permutation.s
+#error_output: vector-insns-fail-permutation.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.l
new file mode 100644 (file)
index 0000000..1ea27f2
--- /dev/null
@@ -0,0 +1,31 @@
+.*: Assembler messages:
+.*Error: illegal operands vd cannot overlap vs2 `vslideup.vx v4,v4,a1'
+.*Error: illegal operands vd cannot overlap vm `vslideup.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vslideup.vi v4,v4,31'
+.*Error: illegal operands vd cannot overlap vm `vslideup.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vslidedown.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vslidedown.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vslide1up.vx v4,v4,a1'
+.*Error: illegal operands vd cannot overlap vm `vslide1up.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vfslide1up.vf v4,v4,fa1'
+.*Error: illegal operands vd cannot overlap vm `vfslide1up.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vslide1down.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vfslide1down.vf v0,v4,fa1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vrgather.vv v4,v4,v8'
+.*Error: illegal operands vd cannot overlap vs1 `vrgather.vv v8,v4,v8'
+.*Error: illegal operands vd cannot overlap vm `vrgather.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vrgather.vx v4,v4,a1'
+.*Error: illegal operands vd cannot overlap vm `vrgather.vx v0,v4,a1,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vrgather.vi v4,v4,31'
+.*Error: illegal operands vd cannot overlap vm `vrgather.vi v0,v4,31,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vrgatherei16.vv v4,v4,v8'
+.*Error: illegal operands vd cannot overlap vs1 `vrgatherei16.vv v8,v4,v8'
+.*Error: illegal operands vd cannot overlap vm `vrgatherei16.vv v0,v4,v8,v0.t'
+.*Error: illegal operands vd cannot overlap vs2 `vcompress.vm v4,v4,v8'
+.*Error: illegal operands vd cannot overlap vs1 `vcompress.vm v8,v4,v8'
+.*Error: illegal operands vs2 must be multiple of nf `vmv2r.v v30,v31'
+.*Error: illegal operands vd must be multiple of nf `vmv2r.v v31,v30'
+.*Error: illegal operands vs2 must be multiple of nf `vmv4r.v v28,v30'
+.*Error: illegal operands vd must be multiple of nf `vmv4r.v v30,v28'
+.*Error: illegal operands vs2 must be multiple of nf `vmv8r.v v24,v26'
+.*Error: illegal operands vd must be multiple of nf `vmv8r.v v26,v24'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-permutation.s
new file mode 100644 (file)
index 0000000..8511645
--- /dev/null
@@ -0,0 +1,56 @@
+# Vector Slideup Instructions
+
+       vslideup.vx v4, v4, a1                  # vd overlap vs2
+       vslideup.vx v0, v4, a1, v0.t            # vd overlap vm
+       vslideup.vi v4, v4, 31                  # vd overlap vs2
+       vslideup.vi v0, v4, 31, v0.t            # vd overlap vm
+
+       vslidedown.vx v4, v4, a1                # OK
+       vslidedown.vx v0, v4, a1, v0.t          # vd overlap vm
+       vslidedown.vi v4, v4, 31                # OK
+       vslidedown.vi v0, v4, 31, v0.t          # vd overlap vm
+
+       vslide1up.vx v4, v4, a1                 # vd overlap vs2
+       vslide1up.vx v0, v4, a1, v0.t           # vd overlap vm
+       vfslide1up.vf v4, v4, fa1               # vd overlap vs2
+       vfslide1up.vf v0, v4, fa1, v0.t         # vd overlap vm
+
+       vslide1down.vx v4, v4, a1               # OK
+       vslide1down.vx v0, v4, a1, v0.t         # vd overlap vm
+       vfslide1down.vf v4, v4, fa1             # OK
+       vfslide1down.vf v0, v4, fa1, v0.t       # vd overlap vm
+
+# Vector Register Gather Instructions
+
+       vrgather.vv v4, v4, v8                  # vd overlap vs2
+       vrgather.vv v8, v4, v8                  # vd overlap vs1
+       vrgather.vv v0, v4, v8, v0.t            # vd overlap vm
+       vrgather.vx v4, v4, a1                  # vd overlap vs2
+       vrgather.vx v0, v4, a1, v0.t            # vd overlap vm
+       vrgather.vi v4, v4, 31                  # vd overlap vs2
+       vrgather.vi v0, v4, 31, v0.t            # vd overlap vm
+
+       vrgatherei16.vv v4, v4, v8              # vd overlap vs2
+       vrgatherei16.vv v8, v4, v8              # vd overlap vs1
+       vrgatherei16.vv v0, v4, v8, v0.t        # vd overlap vm
+
+# Vector Compress Instruction
+
+       vcompress.vm v4, v4, v8                 # vd overlap vs2
+       vcompress.vm v8, v4, v8                 # vd overlap vs1
+
+# Whole Vector Register Move
+
+       vmv1r.v v31, v31                        # OK, HINT
+
+       vmv2r.v v30, v30                        # OK, HINT
+       vmv2r.v v30, v31                        # vs2 must be aligned to 2
+       vmv2r.v v31, v30                        # vd must be aligned to 2
+
+       vmv4r.v v28, v28                        # OK, HINT
+       vmv4r.v v28, v30                        # vs2 must be aligned to 4
+       vmv4r.v v30, v28                        # vd must be aligned to 4
+
+       vmv8r.v v24, v24                        # OK, HINT
+       vmv8r.v v24, v26                        # vs2 must be aligned to 8
+       vmv8r.v v26, v24                        # vd must be aligned to 8
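The whole-register-move cases above exercise one simple constraint: for `vmv<nf>r.v`, both `vd` and `vs2` must be register numbers that are multiples of `nf`. A minimal sketch of that check (a simplified model of the assembler's constraint checking, not the actual binutils `match_func` code):

```python
def vmvnr_operands_ok(vd: int, vs2: int, nf: int) -> bool:
    """Model the vmv<nf>r.v alignment constraint: both register
    numbers must be multiples of the group size nf (1, 2, 4 or 8)."""
    return vd % nf == 0 and vs2 % nf == 0

# Mirrors the cases in the test source above.
assert vmvnr_operands_ok(30, 30, 2)        # vmv2r.v v30, v30 -> OK, HINT
assert not vmvnr_operands_ok(30, 31, 2)    # vs2 must be aligned to 2
assert not vmvnr_operands_ok(31, 30, 2)    # vd must be aligned to 2
assert vmvnr_operands_ok(24, 24, 8)        # vmv8r.v v24, v24 -> OK, HINT
assert not vmvnr_operands_ok(24, 26, 8)    # vs2 must be aligned to 8
```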
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.d
new file mode 100644 (file)
index 0000000..5749449
--- /dev/null
@@ -0,0 +1,3 @@
+#as: -march=rv32iav -mcheck-constraints
+#source: vector-insns-fail-zvamo.s
+#error_output: vector-insns-fail-zvamo.l
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.l b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.l
new file mode 100644 (file)
index 0000000..ae414f7
--- /dev/null
@@ -0,0 +1,109 @@
+.*: Assembler messages:
+.*Error: illegal operands `vamoaddei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoswapei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoxorei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoandei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoandei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoandei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoorei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoorei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoorei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominuei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominuei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominuei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxuei8.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei8.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei8.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoaddei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoswapei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoxorei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoandei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoandei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoandei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoorei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoorei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoorei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominuei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominuei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominuei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei16.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxuei16.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei16.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoaddei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoswapei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoxorei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoandei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoandei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoandei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoorei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoorei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoorei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominuei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominuei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominuei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxuei32.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei32.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei32.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoaddei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoaddei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoswapei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoswapei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoxorei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoxorei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoandei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoandei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoandei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamoorei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamoorei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamoorei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamominuei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamominuei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamominuei64.v x0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands `vamomaxuei64.v v4,\(a1\),v4,v0'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei64.v v0,\(a1\),v4,v0,v0.t'
+.*Error: illegal operands vd cannot overlap vm `vamomaxuei64.v x0,\(a1\),v4,v0,v0.t'
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.s b/gas/testsuite/gas/riscv/extended/vector-insns-fail-zvamo.s
new file mode 100644 (file)
index 0000000..0fd3c26
--- /dev/null
@@ -0,0 +1,217 @@
+# Vector AMO Operations
+
+       vamoaddei8.v v0, (a1), v4, v0           # OK
+       vamoaddei8.v v4, (a1), v4, v0           # vd must match vs3
+       vamoaddei8.v v0, (a1), v4, v0, v0.t     # vd overlap vm
+       vamoaddei8.v x0, (a1), v4, v0           # OK
+       vamoaddei8.v x0, (a1), v4, v0, v0.t     # vs3 overlap vm
+
+       vamoswapei8.v v0, (a1), v4, v0
+       vamoswapei8.v v4, (a1), v4, v0
+       vamoswapei8.v v0, (a1), v4, v0, v0.t
+       vamoswapei8.v x0, (a1), v4, v0
+       vamoswapei8.v x0, (a1), v4, v0, v0.t
+
+       vamoxorei8.v v0, (a1), v4, v0
+       vamoxorei8.v v4, (a1), v4, v0
+       vamoxorei8.v v0, (a1), v4, v0, v0.t
+       vamoxorei8.v x0, (a1), v4, v0
+       vamoxorei8.v x0, (a1), v4, v0, v0.t
+
+       vamoandei8.v v0, (a1), v4, v0
+       vamoandei8.v v4, (a1), v4, v0
+       vamoandei8.v v0, (a1), v4, v0, v0.t
+       vamoandei8.v x0, (a1), v4, v0
+       vamoandei8.v x0, (a1), v4, v0, v0.t
+
+       vamoorei8.v v0, (a1), v4, v0
+       vamoorei8.v v4, (a1), v4, v0
+       vamoorei8.v v0, (a1), v4, v0, v0.t
+       vamoorei8.v x0, (a1), v4, v0
+       vamoorei8.v x0, (a1), v4, v0, v0.t
+
+       vamominei8.v v0, (a1), v4, v0
+       vamominei8.v v4, (a1), v4, v0
+       vamominei8.v v0, (a1), v4, v0, v0.t
+       vamominei8.v x0, (a1), v4, v0
+       vamominei8.v x0, (a1), v4, v0, v0.t
+
+       vamomaxei8.v v0, (a1), v4, v0
+       vamomaxei8.v v4, (a1), v4, v0
+       vamomaxei8.v v0, (a1), v4, v0, v0.t
+       vamomaxei8.v x0, (a1), v4, v0
+       vamomaxei8.v x0, (a1), v4, v0, v0.t
+
+       vamominuei8.v v0, (a1), v4, v0
+       vamominuei8.v v4, (a1), v4, v0
+       vamominuei8.v v0, (a1), v4, v0, v0.t
+       vamominuei8.v x0, (a1), v4, v0
+       vamominuei8.v x0, (a1), v4, v0, v0.t
+
+       vamomaxuei8.v v0, (a1), v4, v0
+       vamomaxuei8.v v4, (a1), v4, v0
+       vamomaxuei8.v v0, (a1), v4, v0, v0.t
+       vamomaxuei8.v x0, (a1), v4, v0
+       vamomaxuei8.v x0, (a1), v4, v0, v0.t
+
+       vamoaddei16.v v0, (a1), v4, v0
+       vamoaddei16.v v4, (a1), v4, v0
+       vamoaddei16.v v0, (a1), v4, v0, v0.t
+       vamoaddei16.v x0, (a1), v4, v0
+       vamoaddei16.v x0, (a1), v4, v0, v0.t
+
+       vamoswapei16.v v0, (a1), v4, v0
+       vamoswapei16.v v0, (a1), v4, v0, v0.t
+       vamoswapei16.v v4, (a1), v4, v0
+       vamoswapei16.v x0, (a1), v4, v0
+       vamoswapei16.v x0, (a1), v4, v0, v0.t
+
+       vamoxorei16.v v0, (a1), v4, v0
+       vamoxorei16.v v0, (a1), v4, v0, v0.t
+       vamoxorei16.v v4, (a1), v4, v0
+       vamoxorei16.v x0, (a1), v4, v0
+       vamoxorei16.v x0, (a1), v4, v0, v0.t
+
+       vamoandei16.v v0, (a1), v4, v0
+       vamoandei16.v v0, (a1), v4, v0, v0.t
+       vamoandei16.v v4, (a1), v4, v0
+       vamoandei16.v x0, (a1), v4, v0
+       vamoandei16.v x0, (a1), v4, v0, v0.t
+
+       vamoorei16.v v0, (a1), v4, v0
+       vamoorei16.v v0, (a1), v4, v0, v0.t
+       vamoorei16.v v4, (a1), v4, v0
+       vamoorei16.v x0, (a1), v4, v0
+       vamoorei16.v x0, (a1), v4, v0, v0.t
+
+       vamominei16.v v0, (a1), v4, v0
+       vamominei16.v v0, (a1), v4, v0, v0.t
+       vamominei16.v v4, (a1), v4, v0
+       vamominei16.v x0, (a1), v4, v0
+       vamominei16.v x0, (a1), v4, v0, v0.t
+
+       vamomaxei16.v v0, (a1), v4, v0
+       vamomaxei16.v v0, (a1), v4, v0, v0.t
+       vamomaxei16.v v4, (a1), v4, v0
+       vamomaxei16.v x0, (a1), v4, v0
+       vamomaxei16.v x0, (a1), v4, v0, v0.t
+
+       vamominuei16.v v0, (a1), v4, v0
+       vamominuei16.v v0, (a1), v4, v0, v0.t
+       vamominuei16.v v4, (a1), v4, v0
+       vamominuei16.v x0, (a1), v4, v0
+       vamominuei16.v x0, (a1), v4, v0, v0.t
+
+       vamomaxuei16.v v0, (a1), v4, v0
+       vamomaxuei16.v v0, (a1), v4, v0, v0.t
+       vamomaxuei16.v v4, (a1), v4, v0
+       vamomaxuei16.v x0, (a1), v4, v0
+       vamomaxuei16.v x0, (a1), v4, v0, v0.t
+
+       vamoaddei32.v v0, (a1), v4, v0
+       vamoaddei32.v v0, (a1), v4, v0, v0.t
+       vamoaddei32.v v4, (a1), v4, v0
+       vamoaddei32.v x0, (a1), v4, v0
+       vamoaddei32.v x0, (a1), v4, v0, v0.t
+
+       vamoswapei32.v v0, (a1), v4, v0
+       vamoswapei32.v v4, (a1), v4, v0
+       vamoswapei32.v v0, (a1), v4, v0, v0.t
+       vamoswapei32.v x0, (a1), v4, v0
+       vamoswapei32.v x0, (a1), v4, v0, v0.t
+
+       vamoxorei32.v v0, (a1), v4, v0
+       vamoxorei32.v v4, (a1), v4, v0
+       vamoxorei32.v v0, (a1), v4, v0, v0.t
+       vamoxorei32.v x0, (a1), v4, v0
+       vamoxorei32.v x0, (a1), v4, v0, v0.t
+
+       vamoandei32.v v0, (a1), v4, v0
+       vamoandei32.v v4, (a1), v4, v0
+       vamoandei32.v v0, (a1), v4, v0, v0.t
+       vamoandei32.v x0, (a1), v4, v0
+       vamoandei32.v x0, (a1), v4, v0, v0.t
+
+       vamoorei32.v v0, (a1), v4, v0
+       vamoorei32.v v4, (a1), v4, v0
+       vamoorei32.v v0, (a1), v4, v0, v0.t
+       vamoorei32.v x0, (a1), v4, v0
+       vamoorei32.v x0, (a1), v4, v0, v0.t
+
+       vamominei32.v v0, (a1), v4, v0
+       vamominei32.v v4, (a1), v4, v0
+       vamominei32.v v0, (a1), v4, v0, v0.t
+       vamominei32.v x0, (a1), v4, v0
+       vamominei32.v x0, (a1), v4, v0, v0.t
+
+       vamomaxei32.v v0, (a1), v4, v0
+       vamomaxei32.v v4, (a1), v4, v0
+       vamomaxei32.v v0, (a1), v4, v0, v0.t
+       vamomaxei32.v x0, (a1), v4, v0
+       vamomaxei32.v x0, (a1), v4, v0, v0.t
+
+       vamominuei32.v v0, (a1), v4, v0
+       vamominuei32.v v4, (a1), v4, v0
+       vamominuei32.v v0, (a1), v4, v0, v0.t
+       vamominuei32.v x0, (a1), v4, v0
+       vamominuei32.v x0, (a1), v4, v0, v0.t
+
+       vamomaxuei32.v v0, (a1), v4, v0
+       vamomaxuei32.v v4, (a1), v4, v0
+       vamomaxuei32.v v0, (a1), v4, v0, v0.t
+       vamomaxuei32.v x0, (a1), v4, v0
+       vamomaxuei32.v x0, (a1), v4, v0, v0.t
+
+       vamoaddei64.v v0, (a1), v4, v0
+       vamoaddei64.v v4, (a1), v4, v0
+       vamoaddei64.v v0, (a1), v4, v0, v0.t
+       vamoaddei64.v x0, (a1), v4, v0
+       vamoaddei64.v x0, (a1), v4, v0, v0.t
+
+       vamoswapei64.v v0, (a1), v4, v0
+       vamoswapei64.v v4, (a1), v4, v0
+       vamoswapei64.v v0, (a1), v4, v0, v0.t
+       vamoswapei64.v x0, (a1), v4, v0
+       vamoswapei64.v x0, (a1), v4, v0, v0.t
+
+       vamoxorei64.v v0, (a1), v4, v0
+       vamoxorei64.v v4, (a1), v4, v0
+       vamoxorei64.v v0, (a1), v4, v0, v0.t
+       vamoxorei64.v x0, (a1), v4, v0
+       vamoxorei64.v x0, (a1), v4, v0, v0.t
+
+       vamoandei64.v v0, (a1), v4, v0
+       vamoandei64.v v4, (a1), v4, v0
+       vamoandei64.v v0, (a1), v4, v0, v0.t
+       vamoandei64.v x0, (a1), v4, v0
+       vamoandei64.v x0, (a1), v4, v0, v0.t
+
+       vamoorei64.v v0, (a1), v4, v0
+       vamoorei64.v v4, (a1), v4, v0
+       vamoorei64.v v0, (a1), v4, v0, v0.t
+       vamoorei64.v x0, (a1), v4, v0
+       vamoorei64.v x0, (a1), v4, v0, v0.t
+
+       vamominei64.v v0, (a1), v4, v0
+       vamominei64.v v4, (a1), v4, v0
+       vamominei64.v v0, (a1), v4, v0, v0.t
+       vamominei64.v x0, (a1), v4, v0
+       vamominei64.v x0, (a1), v4, v0, v0.t
+
+       vamomaxei64.v v0, (a1), v4, v0
+       vamomaxei64.v v4, (a1), v4, v0
+       vamomaxei64.v v0, (a1), v4, v0, v0.t
+       vamomaxei64.v x0, (a1), v4, v0
+       vamomaxei64.v x0, (a1), v4, v0, v0.t
+
+       vamominuei64.v v0, (a1), v4, v0
+       vamominuei64.v v4, (a1), v4, v0
+       vamominuei64.v v0, (a1), v4, v0, v0.t
+       vamominuei64.v x0, (a1), v4, v0
+       vamominuei64.v x0, (a1), v4, v0, v0.t
+
+       vamomaxuei64.v v0, (a1), v4, v0
+       vamomaxuei64.v v4, (a1), v4, v0
+       vamomaxuei64.v v0, (a1), v4, v0, v0.t
+       vamomaxuei64.v x0, (a1), v4, v0
+       vamomaxuei64.v x0, (a1), v4, v0, v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d b/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.d
new file mode 100644 (file)
index 0000000..4d33fe7
--- /dev/null
@@ -0,0 +1,29 @@
+#as: -march=rv32iv
+#objdump: -dr
+
+.*:[   ]+file format .*
+
+
+Disassembly of section .text:
+
+0+000 <.text>:
+[      ]+[0-9a-f]+:[   ]+6e85c257[     ]+vmslt.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+76422257[     ]+vmnot.m[      ]+v4,v4
+[      ]+[0-9a-f]+:[   ]+6cc64457[     ]+vmslt.vx[     ]+v8,v12,a2,v0.t
+[      ]+[0-9a-f]+:[   ]+6e802457[     ]+vmxor.mm[     ]+v8,v8,v0
+[      ]+[0-9a-f]+:[   ]+6c85c657[     ]+vmslt.vx[     ]+v12,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+62062057[     ]+vmandnot.mm[  ]+v0,v0,v12
+[      ]+[0-9a-f]+:[   ]+6c85c657[     ]+vmslt.vx[     ]+v12,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+62062657[     ]+vmandnot.mm[  ]+v12,v0,v12
+[      ]+[0-9a-f]+:[   ]+62402257[     ]+vmandnot.mm[  ]+v4,v4,v0
+[      ]+[0-9a-f]+:[   ]+6ac22257[     ]+vmor.mm[      ]+v4,v12,v4
+[      ]+[0-9a-f]+:[   ]+6a85c257[     ]+vmsltu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+76422257[     ]+vmnot.m[      ]+v4,v4
+[      ]+[0-9a-f]+:[   ]+68c64457[     ]+vmsltu.vx[    ]+v8,v12,a2,v0.t
+[      ]+[0-9a-f]+:[   ]+6e802457[     ]+vmxor.mm[     ]+v8,v8,v0
+[      ]+[0-9a-f]+:[   ]+6885c657[     ]+vmsltu.vx[    ]+v12,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+62062057[     ]+vmandnot.mm[  ]+v0,v0,v12
+[      ]+[0-9a-f]+:[   ]+6885c657[     ]+vmsltu.vx[    ]+v12,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+62062657[     ]+vmandnot.mm[  ]+v12,v0,v12
+[      ]+[0-9a-f]+:[   ]+62402257[     ]+vmandnot.mm[  ]+v4,v4,v0
+[      ]+[0-9a-f]+:[   ]+6ac22257[     ]+vmor.mm[      ]+v4,v12,v4
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.s b/gas/testsuite/gas/riscv/extended/vector-insns-vmsgtvx.s
new file mode 100644 (file)
index 0000000..afbb7cc
--- /dev/null
@@ -0,0 +1,9 @@
+       vmsge.vx v4, v8, a1             # unmasked va >= x
+       vmsge.vx v8, v12, a2, v0.t      # masked va >= x, vd != v0
+       vmsge.vx v0, v8, a1, v0.t, v12  # masked va >= x, vd == v0
+       vmsge.vx v4, v8, a1, v0.t, v12  # masked va >= x, any vd
+
+       vmsgeu.vx v4, v8, a1            # unmasked va >= x
+       vmsgeu.vx v8, v12, a2, v0.t     # masked va >= x, vd != v0
+       vmsgeu.vx v0, v8, a1, v0.t, v12 # masked va >= x, vd == v0
+       vmsgeu.vx v4, v8, a1, v0.t, v12 # masked va >= x, any vd
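The expected disassembly above shows how the unmasked `vmsge.vx vd, vs2, rs1` pseudo-instruction is expanded into `vmslt.vx vd, vs2, rs1` followed by `vmnot.m vd, vd`. As a sanity check, the first expanded word (`6e85c257`) can be reassembled by hand, assuming the standard OP-V OPIVX field layout (`funct6 | vm | vs2 | rs1 | funct3 | vd | opcode`):

```python
def encode_opivx(funct6: int, vm: int, vs2: int, rs1: int, vd: int) -> int:
    """Assemble an OP-V OPIVX instruction word (funct3 = 0b100,
    major opcode 0x57) from its bit fields."""
    return ((funct6 << 26) | (vm << 25) | (vs2 << 20) | (rs1 << 15)
            | (0b100 << 12) | (vd << 7) | 0b1010111)

# vmslt.vx v4, v8, a1: funct6 = 0b011011, unmasked (vm = 1), a1 = x11
assert encode_opivx(0b011011, 1, 8, 11, 4) == 0x6e85c257
```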
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-zero-imm.d b/gas/testsuite/gas/riscv/extended/vector-insns-zero-imm.d
new file mode 100644 (file)
index 0000000..f6fe2ff
--- /dev/null
@@ -0,0 +1,17 @@
+#as: -march=rv32ifv
+#objdump: -dr
+
+.*:[   ]+file format .*
+
+
+Disassembly of section .text:
+
+0+000 <.text>:
+[      ]+[0-9a-f]+:[   ]+768fb257[     ]+vmsle.vi[     ]+v4,v8,-1
+[      ]+[0-9a-f]+:[   ]+748fb257[     ]+vmsle.vi[     ]+v4,v8,-1,v0.t
+[      ]+[0-9a-f]+:[   ]+66840257[     ]+vmsne.vv[     ]+v4,v8,v8
+[      ]+[0-9a-f]+:[   ]+64840257[     ]+vmsne.vv[     ]+v4,v8,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+7e8fb257[     ]+vmsgt.vi[     ]+v4,v8,-1
+[      ]+[0-9a-f]+:[   ]+7c8fb257[     ]+vmsgt.vi[     ]+v4,v8,-1,v0.t
+[      ]+[0-9a-f]+:[   ]+62840257[     ]+vmseq.vv[     ]+v4,v8,v8
+[      ]+[0-9a-f]+:[   ]+60840257[     ]+vmseq.vv[     ]+v4,v8,v8,v0.t
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns-zero-imm.s b/gas/testsuite/gas/riscv/extended/vector-insns-zero-imm.s
new file mode 100644 (file)
index 0000000..98b7063
--- /dev/null
@@ -0,0 +1,8 @@
+       vmslt.vi v4, v8, 0
+       vmslt.vi v4, v8, 0, v0.t
+       vmsltu.vi v4, v8, 0
+       vmsltu.vi v4, v8, 0, v0.t
+       vmsge.vi v4, v8, 0
+       vmsge.vi v4, v8, 0, v0.t
+       vmsgeu.vi v4, v8, 0
+       vmsgeu.vi v4, v8, 0, v0.t
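There is no `vmslt.vi` or `vmsge.vi` encoding, so the assembler rewrites these zero-immediate forms: `vmslt.vi vd, vs2, 0` becomes `vmsle.vi vd, vs2, -1`, while the unsigned `vmsltu.vi vd, vs2, 0` (always false) becomes `vmsne.vv vd, vs2, vs2`, as the expected disassembly above shows. The first rewritten word (`768fb257`) can be checked by hand, assuming the standard OP-V OPIVI field layout:

```python
def encode_opivi(funct6: int, vm: int, vs2: int, simm5: int, vd: int) -> int:
    """Assemble an OP-V OPIVI instruction word (funct3 = 0b011,
    major opcode 0x57); simm5 is the 5-bit signed immediate field."""
    return ((funct6 << 26) | (vm << 25) | (vs2 << 20) | ((simm5 & 0x1f) << 15)
            | (0b011 << 12) | (vd << 7) | 0b1010111)

# vmsle.vi v4, v8, -1: funct6 = 0b011101, unmasked (vm = 1)
assert encode_opivi(0b011101, 1, 8, -1, 4) == 0x768fb257
```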
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns.d b/gas/testsuite/gas/riscv/extended/vector-insns.d
new file mode 100644 (file)
index 0000000..01770c4
--- /dev/null
@@ -0,0 +1,1942 @@
+#as: -march=rv32iafv
+#objdump: -dr
+
+.*:[   ]+file format .*
+
+
+Disassembly of section .text:
+
+0+000 <.text>:
+[      ]+[0-9a-f]+:[   ]+80c5f557[     ]+vsetvl[       ]+a0,a1,a2
+[      ]+[0-9a-f]+:[   ]+0005f557[     ]+vsetvli[      ]+a0,a1,e8,m1,tu,mu
+[      ]+[0-9a-f]+:[   ]+7ff5f557[     ]+vsetvli[      ]+a0,a1,2047
+[      ]+[0-9a-f]+:[   ]+0095f557[     ]+vsetvli[      ]+a0,a1,e16,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+02b5f557[     ]+vsetvli[      ]+a0,a1,e256,m8,tu,mu
+[      ]+[0-9a-f]+:[   ]+0335f557[     ]+vsetvli[      ]+a0,a1,e512,m8,tu,mu
+[      ]+[0-9a-f]+:[   ]+03b5f557[     ]+vsetvli[      ]+a0,a1,e1024,m8,tu,mu
+[      ]+[0-9a-f]+:[   ]+0385f557[     ]+vsetvli[      ]+a0,a1,e1024,m1,tu,mu
+[      ]+[0-9a-f]+:[   ]+03f5f557[     ]+vsetvli[      ]+a0,a1,e1024,mf2,tu,mu
+[      ]+[0-9a-f]+:[   ]+0365f557[     ]+vsetvli[      ]+a0,a1,e512,mf4,tu,mu
+[      ]+[0-9a-f]+:[   ]+02d5f557[     ]+vsetvli[      ]+a0,a1,e256,mf8,tu,mu
+[      ]+[0-9a-f]+:[   ]+0695f557[     ]+vsetvli[      ]+a0,a1,e256,m2,ta,mu
+[      ]+[0-9a-f]+:[   ]+0a95f557[     ]+vsetvli[      ]+a0,a1,e256,m2,tu,ma
+[      ]+[0-9a-f]+:[   ]+0295f557[     ]+vsetvli[      ]+a0,a1,e256,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+0295f557[     ]+vsetvli[      ]+a0,a1,e256,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+0e95f557[     ]+vsetvli[      ]+a0,a1,e256,m2,ta,ma
+[      ]+[0-9a-f]+:[   ]+0a95f557[     ]+vsetvli[      ]+a0,a1,e256,m2,tu,ma
+[      ]+[0-9a-f]+:[   ]+0695f557[     ]+vsetvli[      ]+a0,a1,e256,m2,ta,mu
+[      ]+[0-9a-f]+:[   ]+0295f557[     ]+vsetvli[      ]+a0,a1,e256,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+c005f557[     ]+vsetivli[     ]+a0,11,e8,m1,tu,mu
+[      ]+[0-9a-f]+:[   ]+fff5f557[     ]+vsetivli[     ]+a0,11,e1024,mf2,ta,ma
+[      ]+[0-9a-f]+:[   ]+c095f557[     ]+vsetivli[     ]+a0,11,e16,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+c2b5f557[     ]+vsetivli[     ]+a0,11,e256,m8,tu,mu
+[      ]+[0-9a-f]+:[   ]+c335f557[     ]+vsetivli[     ]+a0,11,e512,m8,tu,mu
+[      ]+[0-9a-f]+:[   ]+c3b5f557[     ]+vsetivli[     ]+a0,11,e1024,m8,tu,mu
+[      ]+[0-9a-f]+:[   ]+c385f557[     ]+vsetivli[     ]+a0,11,e1024,m1,tu,mu
+[      ]+[0-9a-f]+:[   ]+c3f5f557[     ]+vsetivli[     ]+a0,11,e1024,mf2,tu,mu
+[      ]+[0-9a-f]+:[   ]+c365f557[     ]+vsetivli[     ]+a0,11,e512,mf4,tu,mu
+[      ]+[0-9a-f]+:[   ]+c2d5f557[     ]+vsetivli[     ]+a0,11,e256,mf8,tu,mu
+[      ]+[0-9a-f]+:[   ]+c695f557[     ]+vsetivli[     ]+a0,11,e256,m2,ta,mu
+[      ]+[0-9a-f]+:[   ]+ca95f557[     ]+vsetivli[     ]+a0,11,e256,m2,tu,ma
+[      ]+[0-9a-f]+:[   ]+c295f557[     ]+vsetivli[     ]+a0,11,e256,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+c295f557[     ]+vsetivli[     ]+a0,11,e256,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+ce95f557[     ]+vsetivli[     ]+a0,11,e256,m2,ta,ma
+[      ]+[0-9a-f]+:[   ]+ca95f557[     ]+vsetivli[     ]+a0,11,e256,m2,tu,ma
+[      ]+[0-9a-f]+:[   ]+c695f557[     ]+vsetivli[     ]+a0,11,e256,m2,ta,mu
+[      ]+[0-9a-f]+:[   ]+c295f557[     ]+vsetivli[     ]+a0,11,e256,m2,tu,mu
+[      ]+[0-9a-f]+:[   ]+02b50207[     ]+vle1.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02b50207[     ]+vle1.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02b50227[     ]+vse1.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02b50227[     ]+vse1.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02050207[     ]+vle8.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02050207[     ]+vle8.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00050207[     ]+vle8.v[       ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02050227[     ]+vse8.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02050227[     ]+vse8.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00050227[     ]+vse8.v[       ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02055207[     ]+vle16.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02055207[     ]+vle16.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00055207[     ]+vle16.v[      ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02055227[     ]+vse16.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02055227[     ]+vse16.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00055227[     ]+vse16.v[      ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02056207[     ]+vle32.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02056207[     ]+vle32.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00056207[     ]+vle32.v[      ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02056227[     ]+vse32.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02056227[     ]+vse32.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00056227[     ]+vse32.v[      ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02057207[     ]+vle64.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02057207[     ]+vle64.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00057207[     ]+vle64.v[      ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02057227[     ]+vse64.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02057227[     ]+vse64.v[      ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+00057227[     ]+vse64.v[      ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+0ab50207[     ]+vlse8.v[      ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab50207[     ]+vlse8.v[      ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b50207[     ]+vlse8.v[      ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab50227[     ]+vsse8.v[      ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab50227[     ]+vsse8.v[      ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b50227[     ]+vsse8.v[      ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab55207[     ]+vlse16.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab55207[     ]+vlse16.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b55207[     ]+vlse16.v[     ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab55227[     ]+vsse16.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab55227[     ]+vsse16.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b55227[     ]+vsse16.v[     ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab56207[     ]+vlse32.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab56207[     ]+vlse32.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b56207[     ]+vlse32.v[     ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab56227[     ]+vsse32.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab56227[     ]+vsse32.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b56227[     ]+vsse32.v[     ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab57207[     ]+vlse64.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab57207[     ]+vlse64.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b57207[     ]+vlse64.v[     ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ab57227[     ]+vsse64.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+0ab57227[     ]+vsse64.v[     ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+08b57227[     ]+vsse64.v[     ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec50207[     ]+vloxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec50207[     ]+vloxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc50207[     ]+vloxei8.v[    ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec50227[     ]+vsoxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec50227[     ]+vsoxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc50227[     ]+vsoxei8.v[    ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c50207[     ]+vluxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c50207[     ]+vluxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c50207[     ]+vluxei8.v[    ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c50227[     ]+vsuxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c50227[     ]+vsuxei8.v[    ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c50227[     ]+vsuxei8.v[    ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec55207[     ]+vloxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec55207[     ]+vloxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc55207[     ]+vloxei16.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec55227[     ]+vsoxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec55227[     ]+vsoxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc55227[     ]+vsoxei16.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c55207[     ]+vluxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c55207[     ]+vluxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c55207[     ]+vluxei16.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c55227[     ]+vsuxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c55227[     ]+vsuxei16.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c55227[     ]+vsuxei16.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec56207[     ]+vloxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec56207[     ]+vloxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc56207[     ]+vloxei32.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec56227[     ]+vsoxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec56227[     ]+vsoxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc56227[     ]+vsoxei32.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c56207[     ]+vluxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c56207[     ]+vluxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c56207[     ]+vluxei32.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c56227[     ]+vsuxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c56227[     ]+vsuxei32.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c56227[     ]+vsuxei32.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec57207[     ]+vloxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec57207[     ]+vloxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc57207[     ]+vloxei64.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0ec57227[     ]+vsoxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0ec57227[     ]+vsoxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+0cc57227[     ]+vsoxei64.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c57207[     ]+vluxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c57207[     ]+vluxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c57207[     ]+vluxei64.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+06c57227[     ]+vsuxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+06c57227[     ]+vsuxei64.v[   ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+04c57227[     ]+vsuxei64.v[   ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+03050207[     ]+vle8ff.v[     ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+03050207[     ]+vle8ff.v[     ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+01050207[     ]+vle8ff.v[     ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+03055207[     ]+vle16ff.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+03055207[     ]+vle16ff.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+01055207[     ]+vle16ff.v[    ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+03056207[     ]+vle32ff.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+03056207[     ]+vle32ff.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+01056207[     ]+vle32ff.v[    ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+03057207[     ]+vle64ff.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+03057207[     ]+vle64ff.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+01057207[     ]+vle64ff.v[    ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22050207[     ]+vlseg2e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22050207[     ]+vlseg2e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20050207[     ]+vlseg2e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22050227[     ]+vsseg2e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22050227[     ]+vsseg2e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20050227[     ]+vsseg2e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42050207[     ]+vlseg3e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42050207[     ]+vlseg3e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40050207[     ]+vlseg3e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42050227[     ]+vsseg3e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42050227[     ]+vsseg3e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40050227[     ]+vsseg3e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62050207[     ]+vlseg4e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62050207[     ]+vlseg4e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60050207[     ]+vlseg4e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62050227[     ]+vsseg4e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62050227[     ]+vsseg4e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60050227[     ]+vsseg4e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82050207[     ]+vlseg5e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82050207[     ]+vlseg5e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80050207[     ]+vlseg5e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82050227[     ]+vsseg5e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82050227[     ]+vsseg5e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80050227[     ]+vsseg5e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2050207[     ]+vlseg6e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2050207[     ]+vlseg6e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0050207[     ]+vlseg6e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2050227[     ]+vsseg6e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2050227[     ]+vsseg6e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0050227[     ]+vsseg6e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2050207[     ]+vlseg7e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2050207[     ]+vlseg7e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0050207[     ]+vlseg7e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2050227[     ]+vsseg7e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2050227[     ]+vsseg7e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0050227[     ]+vsseg7e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2050207[     ]+vlseg8e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2050207[     ]+vlseg8e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0050207[     ]+vlseg8e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2050227[     ]+vsseg8e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2050227[     ]+vsseg8e8.v[   ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0050227[     ]+vsseg8e8.v[   ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22055207[     ]+vlseg2e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22055207[     ]+vlseg2e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20055207[     ]+vlseg2e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22055227[     ]+vsseg2e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22055227[     ]+vsseg2e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20055227[     ]+vsseg2e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42055207[     ]+vlseg3e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42055207[     ]+vlseg3e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40055207[     ]+vlseg3e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42055227[     ]+vsseg3e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42055227[     ]+vsseg3e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40055227[     ]+vsseg3e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62055207[     ]+vlseg4e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62055207[     ]+vlseg4e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60055207[     ]+vlseg4e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62055227[     ]+vsseg4e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62055227[     ]+vsseg4e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60055227[     ]+vsseg4e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82055207[     ]+vlseg5e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82055207[     ]+vlseg5e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80055207[     ]+vlseg5e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82055227[     ]+vsseg5e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82055227[     ]+vsseg5e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80055227[     ]+vsseg5e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2055207[     ]+vlseg6e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2055207[     ]+vlseg6e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0055207[     ]+vlseg6e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2055227[     ]+vsseg6e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2055227[     ]+vsseg6e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0055227[     ]+vsseg6e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2055207[     ]+vlseg7e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2055207[     ]+vlseg7e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0055207[     ]+vlseg7e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2055227[     ]+vsseg7e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2055227[     ]+vsseg7e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0055227[     ]+vsseg7e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2055207[     ]+vlseg8e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2055207[     ]+vlseg8e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0055207[     ]+vlseg8e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2055227[     ]+vsseg8e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2055227[     ]+vsseg8e16.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0055227[     ]+vsseg8e16.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22056207[     ]+vlseg2e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22056207[     ]+vlseg2e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20056207[     ]+vlseg2e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22056227[     ]+vsseg2e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22056227[     ]+vsseg2e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20056227[     ]+vsseg2e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42056207[     ]+vlseg3e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42056207[     ]+vlseg3e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40056207[     ]+vlseg3e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42056227[     ]+vsseg3e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42056227[     ]+vsseg3e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40056227[     ]+vsseg3e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62056207[     ]+vlseg4e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62056207[     ]+vlseg4e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60056207[     ]+vlseg4e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62056227[     ]+vsseg4e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62056227[     ]+vsseg4e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60056227[     ]+vsseg4e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82056207[     ]+vlseg5e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82056207[     ]+vlseg5e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80056207[     ]+vlseg5e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82056227[     ]+vsseg5e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82056227[     ]+vsseg5e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80056227[     ]+vsseg5e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2056207[     ]+vlseg6e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2056207[     ]+vlseg6e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0056207[     ]+vlseg6e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2056227[     ]+vsseg6e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2056227[     ]+vsseg6e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0056227[     ]+vsseg6e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2056207[     ]+vlseg7e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2056207[     ]+vlseg7e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0056207[     ]+vlseg7e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2056227[     ]+vsseg7e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2056227[     ]+vsseg7e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0056227[     ]+vsseg7e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2056207[     ]+vlseg8e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2056207[     ]+vlseg8e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0056207[     ]+vlseg8e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2056227[     ]+vsseg8e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2056227[     ]+vsseg8e32.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0056227[     ]+vsseg8e32.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22057207[     ]+vlseg2e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22057207[     ]+vlseg2e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20057207[     ]+vlseg2e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+22057227[     ]+vsseg2e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22057227[     ]+vsseg2e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+20057227[     ]+vsseg2e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42057207[     ]+vlseg3e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42057207[     ]+vlseg3e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40057207[     ]+vlseg3e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+42057227[     ]+vsseg3e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+42057227[     ]+vsseg3e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+40057227[     ]+vsseg3e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62057207[     ]+vlseg4e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62057207[     ]+vlseg4e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60057207[     ]+vlseg4e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+62057227[     ]+vsseg4e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62057227[     ]+vsseg4e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+60057227[     ]+vsseg4e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82057207[     ]+vlseg5e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82057207[     ]+vlseg5e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80057207[     ]+vlseg5e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+82057227[     ]+vsseg5e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+82057227[     ]+vsseg5e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+80057227[     ]+vsseg5e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2057207[     ]+vlseg6e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2057207[     ]+vlseg6e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0057207[     ]+vlseg6e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a2057227[     ]+vsseg6e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a2057227[     ]+vsseg6e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a0057227[     ]+vsseg6e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2057207[     ]+vlseg7e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2057207[     ]+vlseg7e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0057207[     ]+vlseg7e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c2057227[     ]+vsseg7e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c2057227[     ]+vsseg7e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c0057227[     ]+vsseg7e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2057207[     ]+vlseg8e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2057207[     ]+vlseg8e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0057207[     ]+vlseg8e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e2057227[     ]+vsseg8e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2057227[     ]+vsseg8e64.v[  ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e0057227[     ]+vsseg8e64.v[  ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+2ab50207[     ]+vlsseg2e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab50207[     ]+vlsseg2e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b50207[     ]+vlsseg2e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab50227[     ]+vssseg2e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab50227[     ]+vssseg2e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b50227[     ]+vssseg2e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab50207[     ]+vlsseg3e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab50207[     ]+vlsseg3e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b50207[     ]+vlsseg3e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab50227[     ]+vssseg3e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab50227[     ]+vssseg3e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b50227[     ]+vssseg3e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab50207[     ]+vlsseg4e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab50207[     ]+vlsseg4e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b50207[     ]+vlsseg4e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab50227[     ]+vssseg4e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab50227[     ]+vssseg4e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b50227[     ]+vssseg4e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab50207[     ]+vlsseg5e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab50207[     ]+vlsseg5e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b50207[     ]+vlsseg5e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab50227[     ]+vssseg5e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab50227[     ]+vssseg5e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b50227[     ]+vssseg5e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab50207[     ]+vlsseg6e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab50207[     ]+vlsseg6e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b50207[     ]+vlsseg6e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab50227[     ]+vssseg6e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab50227[     ]+vssseg6e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b50227[     ]+vssseg6e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab50207[     ]+vlsseg7e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab50207[     ]+vlsseg7e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b50207[     ]+vlsseg7e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab50227[     ]+vssseg7e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab50227[     ]+vssseg7e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b50227[     ]+vssseg7e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab50207[     ]+vlsseg8e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab50207[     ]+vlsseg8e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b50207[     ]+vlsseg8e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab50227[     ]+vssseg8e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab50227[     ]+vssseg8e8.v[  ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b50227[     ]+vssseg8e8.v[  ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab55207[     ]+vlsseg2e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab55207[     ]+vlsseg2e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b55207[     ]+vlsseg2e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab55227[     ]+vssseg2e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab55227[     ]+vssseg2e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b55227[     ]+vssseg2e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab55207[     ]+vlsseg3e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab55207[     ]+vlsseg3e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b55207[     ]+vlsseg3e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab55227[     ]+vssseg3e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab55227[     ]+vssseg3e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b55227[     ]+vssseg3e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab55207[     ]+vlsseg4e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab55207[     ]+vlsseg4e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b55207[     ]+vlsseg4e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab55227[     ]+vssseg4e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab55227[     ]+vssseg4e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b55227[     ]+vssseg4e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab55207[     ]+vlsseg5e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab55207[     ]+vlsseg5e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b55207[     ]+vlsseg5e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab55227[     ]+vssseg5e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab55227[     ]+vssseg5e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b55227[     ]+vssseg5e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab55207[     ]+vlsseg6e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab55207[     ]+vlsseg6e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b55207[     ]+vlsseg6e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab55227[     ]+vssseg6e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab55227[     ]+vssseg6e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b55227[     ]+vssseg6e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab55207[     ]+vlsseg7e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab55207[     ]+vlsseg7e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b55207[     ]+vlsseg7e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab55227[     ]+vssseg7e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab55227[     ]+vssseg7e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b55227[     ]+vssseg7e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab55207[     ]+vlsseg8e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab55207[     ]+vlsseg8e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b55207[     ]+vlsseg8e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab55227[     ]+vssseg8e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab55227[     ]+vssseg8e16.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b55227[     ]+vssseg8e16.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab56207[     ]+vlsseg2e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab56207[     ]+vlsseg2e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b56207[     ]+vlsseg2e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab56227[     ]+vssseg2e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab56227[     ]+vssseg2e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b56227[     ]+vssseg2e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab56207[     ]+vlsseg3e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab56207[     ]+vlsseg3e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b56207[     ]+vlsseg3e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab56227[     ]+vssseg3e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab56227[     ]+vssseg3e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b56227[     ]+vssseg3e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab56207[     ]+vlsseg4e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab56207[     ]+vlsseg4e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b56207[     ]+vlsseg4e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab56227[     ]+vssseg4e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab56227[     ]+vssseg4e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b56227[     ]+vssseg4e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab56207[     ]+vlsseg5e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab56207[     ]+vlsseg5e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b56207[     ]+vlsseg5e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab56227[     ]+vssseg5e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab56227[     ]+vssseg5e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b56227[     ]+vssseg5e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab56207[     ]+vlsseg6e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab56207[     ]+vlsseg6e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b56207[     ]+vlsseg6e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab56227[     ]+vssseg6e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab56227[     ]+vssseg6e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b56227[     ]+vssseg6e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab56207[     ]+vlsseg7e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab56207[     ]+vlsseg7e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b56207[     ]+vlsseg7e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab56227[     ]+vssseg7e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab56227[     ]+vssseg7e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b56227[     ]+vssseg7e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab56207[     ]+vlsseg8e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab56207[     ]+vlsseg8e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b56207[     ]+vlsseg8e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab56227[     ]+vssseg8e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab56227[     ]+vssseg8e32.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b56227[     ]+vssseg8e32.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab57207[     ]+vlsseg2e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab57207[     ]+vlsseg2e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b57207[     ]+vlsseg2e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ab57227[     ]+vssseg2e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+2ab57227[     ]+vssseg2e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+28b57227[     ]+vssseg2e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab57207[     ]+vlsseg3e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab57207[     ]+vlsseg3e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b57207[     ]+vlsseg3e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4ab57227[     ]+vssseg3e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+4ab57227[     ]+vssseg3e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+48b57227[     ]+vssseg3e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab57207[     ]+vlsseg4e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab57207[     ]+vlsseg4e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b57207[     ]+vlsseg4e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6ab57227[     ]+vssseg4e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+6ab57227[     ]+vssseg4e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+68b57227[     ]+vssseg4e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab57207[     ]+vlsseg5e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab57207[     ]+vlsseg5e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b57207[     ]+vlsseg5e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8ab57227[     ]+vssseg5e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+8ab57227[     ]+vssseg5e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+88b57227[     ]+vssseg5e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab57207[     ]+vlsseg6e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab57207[     ]+vlsseg6e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b57207[     ]+vlsseg6e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aab57227[     ]+vssseg6e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+aab57227[     ]+vssseg6e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+a8b57227[     ]+vssseg6e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab57207[     ]+vlsseg7e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab57207[     ]+vlsseg7e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b57207[     ]+vlsseg7e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+cab57227[     ]+vssseg7e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+cab57227[     ]+vssseg7e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+c8b57227[     ]+vssseg7e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab57207[     ]+vlsseg8e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab57207[     ]+vlsseg8e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b57207[     ]+vlsseg8e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+eab57227[     ]+vssseg8e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+eab57227[     ]+vssseg8e64.v[         ]+v4,\(a0\),a1
+[      ]+[0-9a-f]+:[   ]+e8b57227[     ]+vssseg8e64.v[         ]+v4,\(a0\),a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec50207[     ]+vloxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec50207[     ]+vloxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc50207[     ]+vloxseg2ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec50227[     ]+vsoxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec50227[     ]+vsoxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc50227[     ]+vsoxseg2ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec50207[     ]+vloxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec50207[     ]+vloxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc50207[     ]+vloxseg3ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec50227[     ]+vsoxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec50227[     ]+vsoxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc50227[     ]+vsoxseg3ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec50207[     ]+vloxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec50207[     ]+vloxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc50207[     ]+vloxseg4ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec50227[     ]+vsoxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec50227[     ]+vsoxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc50227[     ]+vsoxseg4ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec50207[     ]+vloxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec50207[     ]+vloxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc50207[     ]+vloxseg5ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec50227[     ]+vsoxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec50227[     ]+vsoxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc50227[     ]+vsoxseg5ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec50207[     ]+vloxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec50207[     ]+vloxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc50207[     ]+vloxseg6ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec50227[     ]+vsoxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec50227[     ]+vsoxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc50227[     ]+vsoxseg6ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec50207[     ]+vloxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec50207[     ]+vloxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc50207[     ]+vloxseg7ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec50227[     ]+vsoxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec50227[     ]+vsoxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc50227[     ]+vsoxseg7ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec50207[     ]+vloxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec50207[     ]+vloxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc50207[     ]+vloxseg8ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec50227[     ]+vsoxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec50227[     ]+vsoxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc50227[     ]+vsoxseg8ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec55207[     ]+vloxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec55207[     ]+vloxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc55207[     ]+vloxseg2ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec55227[     ]+vsoxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec55227[     ]+vsoxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc55227[     ]+vsoxseg2ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec55207[     ]+vloxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec55207[     ]+vloxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc55207[     ]+vloxseg3ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec55227[     ]+vsoxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec55227[     ]+vsoxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc55227[     ]+vsoxseg3ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec55207[     ]+vloxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec55207[     ]+vloxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc55207[     ]+vloxseg4ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec55227[     ]+vsoxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec55227[     ]+vsoxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc55227[     ]+vsoxseg4ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec55207[     ]+vloxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec55207[     ]+vloxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc55207[     ]+vloxseg5ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec55227[     ]+vsoxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec55227[     ]+vsoxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc55227[     ]+vsoxseg5ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec55207[     ]+vloxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec55207[     ]+vloxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc55207[     ]+vloxseg6ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec55227[     ]+vsoxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec55227[     ]+vsoxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc55227[     ]+vsoxseg6ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec55207[     ]+vloxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec55207[     ]+vloxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc55207[     ]+vloxseg7ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec55227[     ]+vsoxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec55227[     ]+vsoxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc55227[     ]+vsoxseg7ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec55207[     ]+vloxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec55207[     ]+vloxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc55207[     ]+vloxseg8ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec55227[     ]+vsoxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec55227[     ]+vsoxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc55227[     ]+vsoxseg8ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec56207[     ]+vloxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec56207[     ]+vloxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc56207[     ]+vloxseg2ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec56227[     ]+vsoxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec56227[     ]+vsoxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc56227[     ]+vsoxseg2ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec56207[     ]+vloxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec56207[     ]+vloxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc56207[     ]+vloxseg3ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec56227[     ]+vsoxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec56227[     ]+vsoxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc56227[     ]+vsoxseg3ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec56207[     ]+vloxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec56207[     ]+vloxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc56207[     ]+vloxseg4ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec56227[     ]+vsoxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec56227[     ]+vsoxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc56227[     ]+vsoxseg4ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec56207[     ]+vloxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec56207[     ]+vloxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc56207[     ]+vloxseg5ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec56227[     ]+vsoxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec56227[     ]+vsoxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc56227[     ]+vsoxseg5ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec56207[     ]+vloxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec56207[     ]+vloxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc56207[     ]+vloxseg6ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec56227[     ]+vsoxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec56227[     ]+vsoxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc56227[     ]+vsoxseg6ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec56207[     ]+vloxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec56207[     ]+vloxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc56207[     ]+vloxseg7ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec56227[     ]+vsoxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec56227[     ]+vsoxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc56227[     ]+vsoxseg7ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec56207[     ]+vloxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec56207[     ]+vloxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc56207[     ]+vloxseg8ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec56227[     ]+vsoxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec56227[     ]+vsoxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc56227[     ]+vsoxseg8ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec57207[     ]+vloxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec57207[     ]+vloxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc57207[     ]+vloxseg2ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2ec57227[     ]+vsoxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2ec57227[     ]+vsoxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+2cc57227[     ]+vsoxseg2ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec57207[     ]+vloxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec57207[     ]+vloxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc57207[     ]+vloxseg3ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+4ec57227[     ]+vsoxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4ec57227[     ]+vsoxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+4cc57227[     ]+vsoxseg3ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec57207[     ]+vloxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec57207[     ]+vloxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc57207[     ]+vloxseg4ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec57227[     ]+vsoxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6ec57227[     ]+vsoxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+6cc57227[     ]+vsoxseg4ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec57207[     ]+vloxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec57207[     ]+vloxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc57207[     ]+vloxseg5ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8ec57227[     ]+vsoxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8ec57227[     ]+vsoxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+8cc57227[     ]+vsoxseg5ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec57207[     ]+vloxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec57207[     ]+vloxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc57207[     ]+vloxseg6ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+aec57227[     ]+vsoxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+aec57227[     ]+vsoxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+acc57227[     ]+vsoxseg6ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec57207[     ]+vloxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec57207[     ]+vloxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc57207[     ]+vloxseg7ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cec57227[     ]+vsoxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+cec57227[     ]+vsoxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ccc57227[     ]+vsoxseg7ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec57207[     ]+vloxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec57207[     ]+vloxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc57207[     ]+vloxseg8ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+eec57227[     ]+vsoxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+eec57227[     ]+vsoxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+ecc57227[     ]+vsoxseg8ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c50207[     ]+vluxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c50207[     ]+vluxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c50207[     ]+vluxseg2ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c50227[     ]+vsuxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c50227[     ]+vsuxseg2ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c50227[     ]+vsuxseg2ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c50207[     ]+vluxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c50207[     ]+vluxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c50207[     ]+vluxseg3ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c50227[     ]+vsuxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c50227[     ]+vsuxseg3ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c50227[     ]+vsuxseg3ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c50207[     ]+vluxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c50207[     ]+vluxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c50207[     ]+vluxseg4ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c50227[     ]+vsuxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c50227[     ]+vsuxseg4ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c50227[     ]+vsuxseg4ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c50207[     ]+vluxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c50207[     ]+vluxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c50207[     ]+vluxseg5ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c50227[     ]+vsuxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c50227[     ]+vsuxseg5ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c50227[     ]+vsuxseg5ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c50207[     ]+vluxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c50207[     ]+vluxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c50207[     ]+vluxseg6ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c50227[     ]+vsuxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c50227[     ]+vsuxseg6ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c50227[     ]+vsuxseg6ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c50207[     ]+vluxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c50207[     ]+vluxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c50207[     ]+vluxseg7ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c50227[     ]+vsuxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c50227[     ]+vsuxseg7ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c50227[     ]+vsuxseg7ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c50207[     ]+vluxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c50207[     ]+vluxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c50207[     ]+vluxseg8ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c50227[     ]+vsuxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c50227[     ]+vsuxseg8ei8.v[        ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c50227[     ]+vsuxseg8ei8.v[        ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c55207[     ]+vluxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c55207[     ]+vluxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c55207[     ]+vluxseg2ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c55227[     ]+vsuxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c55227[     ]+vsuxseg2ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c55227[     ]+vsuxseg2ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c55207[     ]+vluxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c55207[     ]+vluxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c55207[     ]+vluxseg3ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c55227[     ]+vsuxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c55227[     ]+vsuxseg3ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c55227[     ]+vsuxseg3ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c55207[     ]+vluxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c55207[     ]+vluxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c55207[     ]+vluxseg4ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c55227[     ]+vsuxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c55227[     ]+vsuxseg4ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c55227[     ]+vsuxseg4ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c55207[     ]+vluxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c55207[     ]+vluxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c55207[     ]+vluxseg5ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c55227[     ]+vsuxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c55227[     ]+vsuxseg5ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c55227[     ]+vsuxseg5ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c55207[     ]+vluxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c55207[     ]+vluxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c55207[     ]+vluxseg6ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c55227[     ]+vsuxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c55227[     ]+vsuxseg6ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c55227[     ]+vsuxseg6ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c55207[     ]+vluxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c55207[     ]+vluxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c55207[     ]+vluxseg7ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c55227[     ]+vsuxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c55227[     ]+vsuxseg7ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c55227[     ]+vsuxseg7ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c55207[     ]+vluxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c55207[     ]+vluxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c55207[     ]+vluxseg8ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c55227[     ]+vsuxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c55227[     ]+vsuxseg8ei16.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c55227[     ]+vsuxseg8ei16.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c56207[     ]+vluxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c56207[     ]+vluxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c56207[     ]+vluxseg2ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c56227[     ]+vsuxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c56227[     ]+vsuxseg2ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c56227[     ]+vsuxseg2ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c56207[     ]+vluxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c56207[     ]+vluxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c56207[     ]+vluxseg3ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c56227[     ]+vsuxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c56227[     ]+vsuxseg3ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c56227[     ]+vsuxseg3ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c56207[     ]+vluxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c56207[     ]+vluxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c56207[     ]+vluxseg4ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c56227[     ]+vsuxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c56227[     ]+vsuxseg4ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c56227[     ]+vsuxseg4ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c56207[     ]+vluxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c56207[     ]+vluxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c56207[     ]+vluxseg5ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c56227[     ]+vsuxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c56227[     ]+vsuxseg5ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c56227[     ]+vsuxseg5ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c56207[     ]+vluxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c56207[     ]+vluxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c56207[     ]+vluxseg6ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c56227[     ]+vsuxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c56227[     ]+vsuxseg6ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c56227[     ]+vsuxseg6ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c56207[     ]+vluxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c56207[     ]+vluxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c56207[     ]+vluxseg7ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c56227[     ]+vsuxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c56227[     ]+vsuxseg7ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c56227[     ]+vsuxseg7ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c56207[     ]+vluxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c56207[     ]+vluxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c56207[     ]+vluxseg8ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c56227[     ]+vsuxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c56227[     ]+vsuxseg8ei32.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c56227[     ]+vsuxseg8ei32.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c57207[     ]+vluxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c57207[     ]+vluxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c57207[     ]+vluxseg2ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+26c57227[     ]+vsuxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+26c57227[     ]+vsuxseg2ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+24c57227[     ]+vsuxseg2ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c57207[     ]+vluxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c57207[     ]+vluxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c57207[     ]+vluxseg3ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+46c57227[     ]+vsuxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+46c57227[     ]+vsuxseg3ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+44c57227[     ]+vsuxseg3ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c57207[     ]+vluxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c57207[     ]+vluxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c57207[     ]+vluxseg4ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66c57227[     ]+vsuxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+66c57227[     ]+vsuxseg4ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+64c57227[     ]+vsuxseg4ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c57207[     ]+vluxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c57207[     ]+vluxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c57207[     ]+vluxseg5ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+86c57227[     ]+vsuxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+86c57227[     ]+vsuxseg5ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+84c57227[     ]+vsuxseg5ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c57207[     ]+vluxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c57207[     ]+vluxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c57207[     ]+vluxseg6ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a6c57227[     ]+vsuxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a6c57227[     ]+vsuxseg6ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+a4c57227[     ]+vsuxseg6ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c57207[     ]+vluxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c57207[     ]+vluxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c57207[     ]+vluxseg7ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c6c57227[     ]+vsuxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c6c57227[     ]+vsuxseg7ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+c4c57227[     ]+vsuxseg7ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c57207[     ]+vluxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c57207[     ]+vluxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c57207[     ]+vluxseg8ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e6c57227[     ]+vsuxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e6c57227[     ]+vsuxseg8ei64.v[       ]+v4,\(a0\),v12
+[      ]+[0-9a-f]+:[   ]+e4c57227[     ]+vsuxseg8ei64.v[       ]+v4,\(a0\),v12,v0.t
+[      ]+[0-9a-f]+:[   ]+23050207[     ]+vlseg2e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+23050207[     ]+vlseg2e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+21050207[     ]+vlseg2e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+43050207[     ]+vlseg3e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+43050207[     ]+vlseg3e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+41050207[     ]+vlseg3e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+63050207[     ]+vlseg4e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+63050207[     ]+vlseg4e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+61050207[     ]+vlseg4e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+83050207[     ]+vlseg5e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+83050207[     ]+vlseg5e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+81050207[     ]+vlseg5e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a3050207[     ]+vlseg6e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a3050207[     ]+vlseg6e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a1050207[     ]+vlseg6e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c3050207[     ]+vlseg7e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c3050207[     ]+vlseg7e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c1050207[     ]+vlseg7e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e3050207[     ]+vlseg8e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e3050207[     ]+vlseg8e8ff.v[         ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e1050207[     ]+vlseg8e8ff.v[         ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+23055207[     ]+vlseg2e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+23055207[     ]+vlseg2e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+21055207[     ]+vlseg2e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+43055207[     ]+vlseg3e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+43055207[     ]+vlseg3e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+41055207[     ]+vlseg3e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+63055207[     ]+vlseg4e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+63055207[     ]+vlseg4e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+61055207[     ]+vlseg4e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+83055207[     ]+vlseg5e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+83055207[     ]+vlseg5e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+81055207[     ]+vlseg5e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a3055207[     ]+vlseg6e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a3055207[     ]+vlseg6e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a1055207[     ]+vlseg6e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c3055207[     ]+vlseg7e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c3055207[     ]+vlseg7e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c1055207[     ]+vlseg7e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e3055207[     ]+vlseg8e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e3055207[     ]+vlseg8e16ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e1055207[     ]+vlseg8e16ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+23056207[     ]+vlseg2e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+23056207[     ]+vlseg2e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+21056207[     ]+vlseg2e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+43056207[     ]+vlseg3e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+43056207[     ]+vlseg3e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+41056207[     ]+vlseg3e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+63056207[     ]+vlseg4e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+63056207[     ]+vlseg4e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+61056207[     ]+vlseg4e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+83056207[     ]+vlseg5e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+83056207[     ]+vlseg5e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+81056207[     ]+vlseg5e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a3056207[     ]+vlseg6e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a3056207[     ]+vlseg6e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a1056207[     ]+vlseg6e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c3056207[     ]+vlseg7e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c3056207[     ]+vlseg7e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c1056207[     ]+vlseg7e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e3056207[     ]+vlseg8e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e3056207[     ]+vlseg8e32ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e1056207[     ]+vlseg8e32ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+23057207[     ]+vlseg2e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+23057207[     ]+vlseg2e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+21057207[     ]+vlseg2e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+43057207[     ]+vlseg3e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+43057207[     ]+vlseg3e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+41057207[     ]+vlseg3e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+63057207[     ]+vlseg4e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+63057207[     ]+vlseg4e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+61057207[     ]+vlseg4e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+83057207[     ]+vlseg5e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+83057207[     ]+vlseg5e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+81057207[     ]+vlseg5e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+a3057207[     ]+vlseg6e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a3057207[     ]+vlseg6e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+a1057207[     ]+vlseg6e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+c3057207[     ]+vlseg7e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c3057207[     ]+vlseg7e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+c1057207[     ]+vlseg7e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+e3057207[     ]+vlseg8e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e3057207[     ]+vlseg8e64ff.v[        ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e1057207[     ]+vlseg8e64ff.v[        ]+v4,\(a0\),v0.t
+[      ]+[0-9a-f]+:[   ]+02850187[     ]+vl1r.v[       ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02850187[     ]+vl1r.v[       ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02850187[     ]+vl1r.v[       ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02850187[     ]+vl1r.v[       ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02855187[     ]+vl1re16.v[    ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02855187[     ]+vl1re16.v[    ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02856187[     ]+vl1re32.v[    ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02856187[     ]+vl1re32.v[    ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02857187[     ]+vl1re64.v[    ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+02857187[     ]+vl1re64.v[    ]+v3,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22850107[     ]+vl2r.v[       ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22850107[     ]+vl2r.v[       ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22850107[     ]+vl2r.v[       ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22850107[     ]+vl2r.v[       ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22855107[     ]+vl2re16.v[    ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22855107[     ]+vl2re16.v[    ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22856107[     ]+vl2re32.v[    ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22856107[     ]+vl2re32.v[    ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22857107[     ]+vl2re64.v[    ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+22857107[     ]+vl2re64.v[    ]+v2,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62850207[     ]+vl4r.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62850207[     ]+vl4r.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62850207[     ]+vl4r.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62850207[     ]+vl4r.v[       ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62855207[     ]+vl4re16.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62855207[     ]+vl4re16.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62856207[     ]+vl4re32.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62856207[     ]+vl4re32.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62857207[     ]+vl4re64.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+62857207[     ]+vl4re64.v[    ]+v4,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2850407[     ]+vl8r.v[       ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2850407[     ]+vl8r.v[       ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2850407[     ]+vl8r.v[       ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2850407[     ]+vl8r.v[       ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2855407[     ]+vl8re16.v[    ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2855407[     ]+vl8re16.v[    ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2856407[     ]+vl8re32.v[    ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2856407[     ]+vl8re32.v[    ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2857407[     ]+vl8re64.v[    ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+e2857407[     ]+vl8re64.v[    ]+v8,\(a0\)
+[      ]+[0-9a-f]+:[   ]+028581a7[     ]+vs1r.v[       ]+v3,\(a1\)
+[      ]+[0-9a-f]+:[   ]+028581a7[     ]+vs1r.v[       ]+v3,\(a1\)
+[      ]+[0-9a-f]+:[   ]+22858127[     ]+vs2r.v[       ]+v2,\(a1\)
+[      ]+[0-9a-f]+:[   ]+22858127[     ]+vs2r.v[       ]+v2,\(a1\)
+[      ]+[0-9a-f]+:[   ]+62858227[     ]+vs4r.v[       ]+v4,\(a1\)
+[      ]+[0-9a-f]+:[   ]+62858227[     ]+vs4r.v[       ]+v4,\(a1\)
+[      ]+[0-9a-f]+:[   ]+e2858427[     ]+vs8r.v[       ]+v8,\(a1\)
+[      ]+[0-9a-f]+:[   ]+e2858427[     ]+vs8r.v[       ]+v8,\(a1\)
+[      ]+[0-9a-f]+:[   ]+0685822f[     ]+vamoaddei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285822f[     ]+vamoaddei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485822f[     ]+vamoaddei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085822f[     ]+vamoaddei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85822f[     ]+vamoswapei8.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85822f[     ]+vamoswapei8.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85822f[     ]+vamoswapei8.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885822f[     ]+vamoswapei8.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685822f[     ]+vamoxorei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285822f[     ]+vamoxorei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485822f[     ]+vamoxorei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085822f[     ]+vamoxorei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685822f[     ]+vamoandei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285822f[     ]+vamoandei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485822f[     ]+vamoandei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085822f[     ]+vamoandei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685822f[     ]+vamoorei8.v[  ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285822f[     ]+vamoorei8.v[  ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485822f[     ]+vamoorei8.v[  ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085822f[     ]+vamoorei8.v[  ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685822f[     ]+vamominei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285822f[     ]+vamominei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485822f[     ]+vamominei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085822f[     ]+vamominei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685822f[     ]+vamomaxei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285822f[     ]+vamomaxei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485822f[     ]+vamomaxei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085822f[     ]+vamomaxei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685822f[     ]+vamominuei8.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285822f[     ]+vamominuei8.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485822f[     ]+vamominuei8.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085822f[     ]+vamominuei8.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685822f[     ]+vamomaxuei8.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285822f[     ]+vamomaxuei8.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485822f[     ]+vamomaxuei8.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085822f[     ]+vamomaxuei8.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685822f[     ]+vamoaddei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285822f[     ]+vamoaddei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485822f[     ]+vamoaddei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085822f[     ]+vamoaddei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85822f[     ]+vamoswapei8.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85822f[     ]+vamoswapei8.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85822f[     ]+vamoswapei8.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885822f[     ]+vamoswapei8.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685822f[     ]+vamoxorei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285822f[     ]+vamoxorei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485822f[     ]+vamoxorei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085822f[     ]+vamoxorei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685822f[     ]+vamoandei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285822f[     ]+vamoandei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485822f[     ]+vamoandei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085822f[     ]+vamoandei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685822f[     ]+vamoorei8.v[  ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285822f[     ]+vamoorei8.v[  ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485822f[     ]+vamoorei8.v[  ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085822f[     ]+vamoorei8.v[  ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685822f[     ]+vamominei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285822f[     ]+vamominei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485822f[     ]+vamominei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085822f[     ]+vamominei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685822f[     ]+vamomaxei8.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285822f[     ]+vamomaxei8.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485822f[     ]+vamomaxei8.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085822f[     ]+vamomaxei8.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685822f[     ]+vamominuei8.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285822f[     ]+vamominuei8.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485822f[     ]+vamominuei8.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085822f[     ]+vamominuei8.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685822f[     ]+vamomaxuei8.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285822f[     ]+vamomaxuei8.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485822f[     ]+vamomaxuei8.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085822f[     ]+vamomaxuei8.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685d22f[     ]+vamoaddei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285d22f[     ]+vamoaddei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485d22f[     ]+vamoaddei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085d22f[     ]+vamoaddei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85d22f[     ]+vamoswapei16.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85d22f[     ]+vamoswapei16.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85d22f[     ]+vamoswapei16.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885d22f[     ]+vamoswapei16.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685d22f[     ]+vamoxorei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285d22f[     ]+vamoxorei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485d22f[     ]+vamoxorei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085d22f[     ]+vamoxorei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685d22f[     ]+vamoandei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285d22f[     ]+vamoandei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485d22f[     ]+vamoandei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085d22f[     ]+vamoandei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685d22f[     ]+vamoorei16.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285d22f[     ]+vamoorei16.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485d22f[     ]+vamoorei16.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085d22f[     ]+vamoorei16.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685d22f[     ]+vamominei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285d22f[     ]+vamominei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485d22f[     ]+vamominei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085d22f[     ]+vamominei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685d22f[     ]+vamomaxei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285d22f[     ]+vamomaxei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485d22f[     ]+vamomaxei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085d22f[     ]+vamomaxei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685d22f[     ]+vamominuei16.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285d22f[     ]+vamominuei16.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485d22f[     ]+vamominuei16.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085d22f[     ]+vamominuei16.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685d22f[     ]+vamomaxuei16.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285d22f[     ]+vamomaxuei16.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485d22f[     ]+vamomaxuei16.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085d22f[     ]+vamomaxuei16.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685d22f[     ]+vamoaddei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285d22f[     ]+vamoaddei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485d22f[     ]+vamoaddei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085d22f[     ]+vamoaddei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85d22f[     ]+vamoswapei16.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85d22f[     ]+vamoswapei16.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85d22f[     ]+vamoswapei16.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885d22f[     ]+vamoswapei16.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685d22f[     ]+vamoxorei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285d22f[     ]+vamoxorei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485d22f[     ]+vamoxorei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085d22f[     ]+vamoxorei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685d22f[     ]+vamoandei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285d22f[     ]+vamoandei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485d22f[     ]+vamoandei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085d22f[     ]+vamoandei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685d22f[     ]+vamoorei16.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285d22f[     ]+vamoorei16.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485d22f[     ]+vamoorei16.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085d22f[     ]+vamoorei16.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685d22f[     ]+vamominei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285d22f[     ]+vamominei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485d22f[     ]+vamominei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085d22f[     ]+vamominei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685d22f[     ]+vamomaxei16.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285d22f[     ]+vamomaxei16.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485d22f[     ]+vamomaxei16.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085d22f[     ]+vamomaxei16.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685d22f[     ]+vamominuei16.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285d22f[     ]+vamominuei16.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485d22f[     ]+vamominuei16.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085d22f[     ]+vamominuei16.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685d22f[     ]+vamomaxuei16.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285d22f[     ]+vamomaxuei16.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485d22f[     ]+vamomaxuei16.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085d22f[     ]+vamomaxuei16.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685e22f[     ]+vamoaddei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285e22f[     ]+vamoaddei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485e22f[     ]+vamoaddei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085e22f[     ]+vamoaddei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85e22f[     ]+vamoswapei32.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85e22f[     ]+vamoswapei32.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85e22f[     ]+vamoswapei32.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885e22f[     ]+vamoswapei32.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685e22f[     ]+vamoxorei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285e22f[     ]+vamoxorei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485e22f[     ]+vamoxorei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085e22f[     ]+vamoxorei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685e22f[     ]+vamoandei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285e22f[     ]+vamoandei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485e22f[     ]+vamoandei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085e22f[     ]+vamoandei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685e22f[     ]+vamoorei32.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285e22f[     ]+vamoorei32.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485e22f[     ]+vamoorei32.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085e22f[     ]+vamoorei32.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685e22f[     ]+vamominei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285e22f[     ]+vamominei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485e22f[     ]+vamominei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085e22f[     ]+vamominei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685e22f[     ]+vamomaxei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285e22f[     ]+vamomaxei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485e22f[     ]+vamomaxei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085e22f[     ]+vamomaxei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685e22f[     ]+vamominuei32.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285e22f[     ]+vamominuei32.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485e22f[     ]+vamominuei32.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085e22f[     ]+vamominuei32.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685e22f[     ]+vamomaxuei32.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285e22f[     ]+vamomaxuei32.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485e22f[     ]+vamomaxuei32.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085e22f[     ]+vamomaxuei32.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685e22f[     ]+vamoaddei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285e22f[     ]+vamoaddei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485e22f[     ]+vamoaddei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085e22f[     ]+vamoaddei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85e22f[     ]+vamoswapei32.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85e22f[     ]+vamoswapei32.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85e22f[     ]+vamoswapei32.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885e22f[     ]+vamoswapei32.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685e22f[     ]+vamoxorei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285e22f[     ]+vamoxorei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485e22f[     ]+vamoxorei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085e22f[     ]+vamoxorei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685e22f[     ]+vamoandei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285e22f[     ]+vamoandei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485e22f[     ]+vamoandei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085e22f[     ]+vamoandei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685e22f[     ]+vamoorei32.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285e22f[     ]+vamoorei32.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485e22f[     ]+vamoorei32.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085e22f[     ]+vamoorei32.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685e22f[     ]+vamominei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285e22f[     ]+vamominei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485e22f[     ]+vamominei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085e22f[     ]+vamominei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685e22f[     ]+vamomaxei32.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285e22f[     ]+vamomaxei32.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485e22f[     ]+vamomaxei32.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085e22f[     ]+vamomaxei32.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685e22f[     ]+vamominuei32.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285e22f[     ]+vamominuei32.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485e22f[     ]+vamominuei32.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085e22f[     ]+vamominuei32.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685e22f[     ]+vamomaxuei32.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285e22f[     ]+vamomaxuei32.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485e22f[     ]+vamomaxuei32.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085e22f[     ]+vamomaxuei32.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685f22f[     ]+vamoaddei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285f22f[     ]+vamoaddei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485f22f[     ]+vamoaddei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085f22f[     ]+vamoaddei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85f22f[     ]+vamoswapei64.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85f22f[     ]+vamoswapei64.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85f22f[     ]+vamoswapei64.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885f22f[     ]+vamoswapei64.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685f22f[     ]+vamoxorei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285f22f[     ]+vamoxorei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485f22f[     ]+vamoxorei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085f22f[     ]+vamoxorei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685f22f[     ]+vamoandei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285f22f[     ]+vamoandei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485f22f[     ]+vamoandei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085f22f[     ]+vamoandei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685f22f[     ]+vamoorei64.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285f22f[     ]+vamoorei64.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485f22f[     ]+vamoorei64.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085f22f[     ]+vamoorei64.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685f22f[     ]+vamominei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285f22f[     ]+vamominei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485f22f[     ]+vamominei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085f22f[     ]+vamominei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685f22f[     ]+vamomaxei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285f22f[     ]+vamomaxei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485f22f[     ]+vamomaxei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085f22f[     ]+vamomaxei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685f22f[     ]+vamominuei64.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285f22f[     ]+vamominuei64.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485f22f[     ]+vamominuei64.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085f22f[     ]+vamominuei64.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685f22f[     ]+vamomaxuei64.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285f22f[     ]+vamomaxuei64.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485f22f[     ]+vamomaxuei64.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085f22f[     ]+vamomaxuei64.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0685f22f[     ]+vamoaddei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0285f22f[     ]+vamoaddei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0485f22f[     ]+vamoaddei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0085f22f[     ]+vamoaddei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e85f22f[     ]+vamoswapei64.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0a85f22f[     ]+vamoswapei64.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+0c85f22f[     ]+vamoswapei64.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0885f22f[     ]+vamoswapei64.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2685f22f[     ]+vamoxorei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2285f22f[     ]+vamoxorei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+2485f22f[     ]+vamoxorei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+2085f22f[     ]+vamoxorei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6685f22f[     ]+vamoandei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6285f22f[     ]+vamoandei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+6485f22f[     ]+vamoandei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+6085f22f[     ]+vamoandei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4685f22f[     ]+vamoorei64.v[         ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4285f22f[     ]+vamoorei64.v[         ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+4485f22f[     ]+vamoorei64.v[         ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+4085f22f[     ]+vamoorei64.v[         ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8685f22f[     ]+vamominei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8285f22f[     ]+vamominei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+8485f22f[     ]+vamominei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+8085f22f[     ]+vamominei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a685f22f[     ]+vamomaxei64.v[        ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a285f22f[     ]+vamomaxei64.v[        ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+a485f22f[     ]+vamomaxei64.v[        ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+a085f22f[     ]+vamomaxei64.v[        ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c685f22f[     ]+vamominuei64.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c285f22f[     ]+vamominuei64.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+c485f22f[     ]+vamominuei64.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+c085f22f[     ]+vamominuei64.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e685f22f[     ]+vamomaxuei64.v[       ]+v4,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e285f22f[     ]+vamomaxuei64.v[       ]+zero,\(a1\),v8,v4
+[      ]+[0-9a-f]+:[   ]+e485f22f[     ]+vamomaxuei64.v[       ]+v4,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+e085f22f[     ]+vamomaxuei64.v[       ]+zero,\(a1\),v8,v4,v0.t
+[      ]+[0-9a-f]+:[   ]+0e804257[     ]+vneg.v[       ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+0c804257[     ]+vneg.v[       ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+02860257[     ]+vadd.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+0285c257[     ]+vadd.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+0287b257[     ]+vadd.vi[      ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+02883257[     ]+vadd.vi[      ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+00860257[     ]+vadd.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0085c257[     ]+vadd.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0087b257[     ]+vadd.vi[      ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+00883257[     ]+vadd.vi[      ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+0a860257[     ]+vsub.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+0a85c257[     ]+vsub.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+0e85c257[     ]+vrsub.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+0e87b257[     ]+vrsub.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+0e883257[     ]+vrsub.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+08860257[     ]+vsub.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0885c257[     ]+vsub.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0c85c257[     ]+vrsub.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+0c87b257[     ]+vrsub.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+0c883257[     ]+vrsub.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+c6806257[     ]+vwcvt.x.x.v[  ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+c2806257[     ]+vwcvtu.x.x.v[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+c4806257[     ]+vwcvt.x.x.v[  ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+c0806257[     ]+vwcvtu.x.x.v[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+c2862257[     ]+vwaddu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+c285e257[     ]+vwaddu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+c0862257[     ]+vwaddu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c085e257[     ]+vwaddu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+ca862257[     ]+vwsubu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ca85e257[     ]+vwsubu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+c8862257[     ]+vwsubu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c885e257[     ]+vwsubu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+c6862257[     ]+vwadd.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+c685e257[     ]+vwadd.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+c4862257[     ]+vwadd.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c485e257[     ]+vwadd.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+ce862257[     ]+vwsub.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ce85e257[     ]+vwsub.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+cc862257[     ]+vwsub.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+cc85e257[     ]+vwsub.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+d2862257[     ]+vwaddu.wv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+d285e257[     ]+vwaddu.wx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+d0862257[     ]+vwaddu.wv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+d085e257[     ]+vwaddu.wx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+da862257[     ]+vwsubu.wv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+da85e257[     ]+vwsubu.wx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+d8862257[     ]+vwsubu.wv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+d885e257[     ]+vwsubu.wx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+d6862257[     ]+vwadd.wv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+d685e257[     ]+vwadd.wx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+d4862257[     ]+vwadd.wv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+d485e257[     ]+vwadd.wx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+de862257[     ]+vwsub.wv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+de85e257[     ]+vwsub.wx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+dc862257[     ]+vwsub.wv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+dc85e257[     ]+vwsub.wx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+4a832257[     ]+vzext.vf2[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+48832257[     ]+vzext.vf2[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a83a257[     ]+vsext.vf2[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4883a257[     ]+vsext.vf2[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a822257[     ]+vzext.vf4[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+48822257[     ]+vzext.vf4[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a82a257[     ]+vsext.vf4[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4882a257[     ]+vsext.vf4[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a812257[     ]+vzext.vf8[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+48812257[     ]+vzext.vf8[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a81a257[     ]+vsext.vf8[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4881a257[     ]+vsext.vf8[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+40860257[     ]+vadc.vvm[     ]+v4,v8,v12,v0
+[      ]+[0-9a-f]+:[   ]+4085c257[     ]+vadc.vxm[     ]+v4,v8,a1,v0
+[      ]+[0-9a-f]+:[   ]+4087b257[     ]+vadc.vim[     ]+v4,v8,15,v0
+[      ]+[0-9a-f]+:[   ]+40883257[     ]+vadc.vim[     ]+v4,v8,-16,v0
+[      ]+[0-9a-f]+:[   ]+44860257[     ]+vmadc.vvm[    ]+v4,v8,v12,v0
+[      ]+[0-9a-f]+:[   ]+4485c257[     ]+vmadc.vxm[    ]+v4,v8,a1,v0
+[      ]+[0-9a-f]+:[   ]+4487b257[     ]+vmadc.vim[    ]+v4,v8,15,v0
+[      ]+[0-9a-f]+:[   ]+44883257[     ]+vmadc.vim[    ]+v4,v8,-16,v0
+[      ]+[0-9a-f]+:[   ]+46860257[     ]+vmadc.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+4685c257[     ]+vmadc.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+4687b257[     ]+vmadc.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+46883257[     ]+vmadc.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+48860257[     ]+vsbc.vvm[     ]+v4,v8,v12,v0
+[      ]+[0-9a-f]+:[   ]+4885c257[     ]+vsbc.vxm[     ]+v4,v8,a1,v0
+[      ]+[0-9a-f]+:[   ]+4c860257[     ]+vmsbc.vvm[    ]+v4,v8,v12,v0
+[      ]+[0-9a-f]+:[   ]+4c85c257[     ]+vmsbc.vxm[    ]+v4,v8,a1,v0
+[      ]+[0-9a-f]+:[   ]+4e860257[     ]+vmsbc.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+4e85c257[     ]+vmsbc.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+2e8fb257[     ]+vnot.v[       ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+2c8fb257[     ]+vnot.v[       ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+26860257[     ]+vand.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2685c257[     ]+vand.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+2687b257[     ]+vand.vi[      ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+26883257[     ]+vand.vi[      ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+24860257[     ]+vand.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2485c257[     ]+vand.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2487b257[     ]+vand.vi[      ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+24883257[     ]+vand.vi[      ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+2a860257[     ]+vor.vv[       ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2a85c257[     ]+vor.vx[       ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+2a87b257[     ]+vor.vi[       ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+2a883257[     ]+vor.vi[       ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+28860257[     ]+vor.vv[       ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2885c257[     ]+vor.vx[       ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2887b257[     ]+vor.vi[       ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+28883257[     ]+vor.vi[       ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+2e860257[     ]+vxor.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2e85c257[     ]+vxor.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+2e87b257[     ]+vxor.vi[      ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+2e883257[     ]+vxor.vi[      ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+2c860257[     ]+vxor.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2c85c257[     ]+vxor.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2c87b257[     ]+vxor.vi[      ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+2c883257[     ]+vxor.vi[      ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+96860257[     ]+vsll.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9685c257[     ]+vsll.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+9680b257[     ]+vsll.vi[      ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+968fb257[     ]+vsll.vi[      ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+94860257[     ]+vsll.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+9485c257[     ]+vsll.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+9480b257[     ]+vsll.vi[      ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+948fb257[     ]+vsll.vi[      ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+a2860257[     ]+vsrl.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+a285c257[     ]+vsrl.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+a280b257[     ]+vsrl.vi[      ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+a28fb257[     ]+vsrl.vi[      ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+a0860257[     ]+vsrl.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a085c257[     ]+vsrl.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+a080b257[     ]+vsrl.vi[      ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+a08fb257[     ]+vsrl.vi[      ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+a6860257[     ]+vsra.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+a685c257[     ]+vsra.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+a680b257[     ]+vsra.vi[      ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+a68fb257[     ]+vsra.vi[      ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+a4860257[     ]+vsra.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a485c257[     ]+vsra.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+a480b257[     ]+vsra.vi[      ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+a48fb257[     ]+vsra.vi[      ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+b2804257[     ]+vncvt.x.x.w[  ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+b0804257[     ]+vncvt.x.x.w[  ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b2860257[     ]+vnsrl.wv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+b285c257[     ]+vnsrl.wx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+b280b257[     ]+vnsrl.wi[     ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+b28fb257[     ]+vnsrl.wi[     ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+b0860257[     ]+vnsrl.wv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+b085c257[     ]+vnsrl.wx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+b080b257[     ]+vnsrl.wi[     ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+b08fb257[     ]+vnsrl.wi[     ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+b6860257[     ]+vnsra.wv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+b685c257[     ]+vnsra.wx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+b680b257[     ]+vnsra.wi[     ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+b68fb257[     ]+vnsra.wi[     ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+b4860257[     ]+vnsra.wv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+b485c257[     ]+vnsra.wx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+b480b257[     ]+vnsra.wi[     ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+b48fb257[     ]+vnsra.wi[     ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec40257[     ]+vmslt.vv[     ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+6ac40257[     ]+vmsltu.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+76c40257[     ]+vmsle.vv[     ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+72c40257[     ]+vmsleu.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+6cc40257[     ]+vmslt.vv[     ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+68c40257[     ]+vmsltu.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+74c40257[     ]+vmsle.vv[     ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+70c40257[     ]+vmsleu.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+7687b257[     ]+vmsle.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+76883257[     ]+vmsle.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+7287b257[     ]+vmsleu.vi[    ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+72883257[     ]+vmsleu.vi[    ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+7e87b257[     ]+vmsgt.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+7e883257[     ]+vmsgt.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+7a87b257[     ]+vmsgtu.vi[    ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+7a883257[     ]+vmsgtu.vi[    ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+7487b257[     ]+vmsle.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+74883257[     ]+vmsle.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+7087b257[     ]+vmsleu.vi[    ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+70883257[     ]+vmsleu.vi[    ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+7c87b257[     ]+vmsgt.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+7c883257[     ]+vmsgt.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+7887b257[     ]+vmsgtu.vi[    ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+78883257[     ]+vmsgtu.vi[    ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+62860257[     ]+vmseq.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6285c257[     ]+vmseq.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+6287b257[     ]+vmseq.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+62883257[     ]+vmseq.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+60860257[     ]+vmseq.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6085c257[     ]+vmseq.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6087b257[     ]+vmseq.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+60883257[     ]+vmseq.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+66860257[     ]+vmsne.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6685c257[     ]+vmsne.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+6687b257[     ]+vmsne.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+66883257[     ]+vmsne.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+64860257[     ]+vmsne.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6485c257[     ]+vmsne.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6487b257[     ]+vmsne.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+64883257[     ]+vmsne.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+6a860257[     ]+vmsltu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6a85c257[     ]+vmsltu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+68860257[     ]+vmsltu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6885c257[     ]+vmsltu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+6e860257[     ]+vmslt.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6e85c257[     ]+vmslt.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+6c860257[     ]+vmslt.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6c85c257[     ]+vmslt.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+72860257[     ]+vmsleu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+7285c257[     ]+vmsleu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+7287b257[     ]+vmsleu.vi[    ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+72883257[     ]+vmsleu.vi[    ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+70860257[     ]+vmsleu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+7085c257[     ]+vmsleu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+7087b257[     ]+vmsleu.vi[    ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+70883257[     ]+vmsleu.vi[    ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+76860257[     ]+vmsle.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+7685c257[     ]+vmsle.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+7687b257[     ]+vmsle.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+76883257[     ]+vmsle.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+74860257[     ]+vmsle.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+7485c257[     ]+vmsle.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+7487b257[     ]+vmsle.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+74883257[     ]+vmsle.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+7a85c257[     ]+vmsgtu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+7a87b257[     ]+vmsgtu.vi[    ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+7a883257[     ]+vmsgtu.vi[    ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+7885c257[     ]+vmsgtu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+7887b257[     ]+vmsgtu.vi[    ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+78883257[     ]+vmsgtu.vi[    ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+7e85c257[     ]+vmsgt.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+7e87b257[     ]+vmsgt.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+7e883257[     ]+vmsgt.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+7c85c257[     ]+vmsgt.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+7c87b257[     ]+vmsgt.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+7c883257[     ]+vmsgt.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+12860257[     ]+vminu.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1285c257[     ]+vminu.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+10860257[     ]+vminu.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+1085c257[     ]+vminu.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+16860257[     ]+vmin.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1685c257[     ]+vmin.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+14860257[     ]+vmin.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+1485c257[     ]+vmin.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+1a860257[     ]+vmaxu.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1a85c257[     ]+vmaxu.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+18860257[     ]+vmaxu.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+1885c257[     ]+vmaxu.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+1e860257[     ]+vmax.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1e85c257[     ]+vmax.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+1c860257[     ]+vmax.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+1c85c257[     ]+vmax.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+96862257[     ]+vmul.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9685e257[     ]+vmul.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+94862257[     ]+vmul.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+9485e257[     ]+vmul.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+9e862257[     ]+vmulh.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9e85e257[     ]+vmulh.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+9c862257[     ]+vmulh.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+9c85e257[     ]+vmulh.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+92862257[     ]+vmulhu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9285e257[     ]+vmulhu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+90862257[     ]+vmulhu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+9085e257[     ]+vmulhu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+9a862257[     ]+vmulhsu.vv[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9a85e257[     ]+vmulhsu.vx[   ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+98862257[     ]+vmulhsu.vv[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+9885e257[     ]+vmulhsu.vx[   ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+ee862257[     ]+vwmul.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ee85e257[     ]+vwmul.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+ec862257[     ]+vwmul.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+ec85e257[     ]+vwmul.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+e2862257[     ]+vwmulu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+e285e257[     ]+vwmulu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+e0862257[     ]+vwmulu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e085e257[     ]+vwmulu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+ea862257[     ]+vwmulsu.vv[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ea85e257[     ]+vwmulsu.vx[   ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+e8862257[     ]+vwmulsu.vv[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e885e257[     ]+vwmulsu.vx[   ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+b6862257[     ]+vmacc.vv[     ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+b685e257[     ]+vmacc.vx[     ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+b4862257[     ]+vmacc.vv[     ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b485e257[     ]+vmacc.vx[     ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+be862257[     ]+vnmsac.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+be85e257[     ]+vnmsac.vx[    ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+bc862257[     ]+vnmsac.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+bc85e257[     ]+vnmsac.vx[    ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a6862257[     ]+vmadd.vv[     ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+a685e257[     ]+vmadd.vx[     ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+a4862257[     ]+vmadd.vv[     ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a485e257[     ]+vmadd.vx[     ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+ae862257[     ]+vnmsub.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+ae85e257[     ]+vnmsub.vx[    ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+ac862257[     ]+vnmsub.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+ac85e257[     ]+vnmsub.vx[    ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f2862257[     ]+vwmaccu.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+f285e257[     ]+vwmaccu.vx[   ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+f0862257[     ]+vwmaccu.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f085e257[     ]+vwmaccu.vx[   ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f6862257[     ]+vwmacc.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+f685e257[     ]+vwmacc.vx[    ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+f4862257[     ]+vwmacc.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f485e257[     ]+vwmacc.vx[    ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+fe862257[     ]+vwmaccsu.vv[  ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+fe85e257[     ]+vwmaccsu.vx[  ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+fc862257[     ]+vwmaccsu.vv[  ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+fc85e257[     ]+vwmaccsu.vx[  ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+fa85e257[     ]+vwmaccus.vx[  ]+v4,a1,v8
+[      ]+[0-9a-f]+:[   ]+f885e257[     ]+vwmaccus.vx[  ]+v4,a1,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+82862257[     ]+vdivu.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8285e257[     ]+vdivu.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+80862257[     ]+vdivu.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8085e257[     ]+vdivu.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+86862257[     ]+vdiv.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8685e257[     ]+vdiv.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+84862257[     ]+vdiv.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8485e257[     ]+vdiv.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8a862257[     ]+vremu.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8a85e257[     ]+vremu.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+88862257[     ]+vremu.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8885e257[     ]+vremu.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8e862257[     ]+vrem.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8e85e257[     ]+vrem.vx[      ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+8c862257[     ]+vrem.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8c85e257[     ]+vrem.vx[      ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+5c860257[     ]+vmerge.vvm[   ]+v4,v8,v12,v0
+[      ]+[0-9a-f]+:[   ]+5c85c257[     ]+vmerge.vxm[   ]+v4,v8,a1,v0
+[      ]+[0-9a-f]+:[   ]+5c87b257[     ]+vmerge.vim[   ]+v4,v8,15,v0
+[      ]+[0-9a-f]+:[   ]+5c883257[     ]+vmerge.vim[   ]+v4,v8,-16,v0
+[      ]+[0-9a-f]+:[   ]+5e060457[     ]+vmv.v.v[      ]+v8,v12
+[      ]+[0-9a-f]+:[   ]+5e05c457[     ]+vmv.v.x[      ]+v8,a1
+[      ]+[0-9a-f]+:[   ]+5e07b457[     ]+vmv.v.i[      ]+v8,15
+[      ]+[0-9a-f]+:[   ]+5e083457[     ]+vmv.v.i[      ]+v8,-16
+[      ]+[0-9a-f]+:[   ]+82860257[     ]+vsaddu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8285c257[     ]+vsaddu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+8287b257[     ]+vsaddu.vi[    ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+82883257[     ]+vsaddu.vi[    ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+80860257[     ]+vsaddu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8085c257[     ]+vsaddu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8087b257[     ]+vsaddu.vi[    ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+80883257[     ]+vsaddu.vi[    ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+86860257[     ]+vsadd.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8685c257[     ]+vsadd.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+8687b257[     ]+vsadd.vi[     ]+v4,v8,15
+[      ]+[0-9a-f]+:[   ]+86883257[     ]+vsadd.vi[     ]+v4,v8,-16
+[      ]+[0-9a-f]+:[   ]+84860257[     ]+vsadd.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8485c257[     ]+vsadd.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8487b257[     ]+vsadd.vi[     ]+v4,v8,15,v0.t
+[      ]+[0-9a-f]+:[   ]+84883257[     ]+vsadd.vi[     ]+v4,v8,-16,v0.t
+[      ]+[0-9a-f]+:[   ]+8a860257[     ]+vssubu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8a85c257[     ]+vssubu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+88860257[     ]+vssubu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8885c257[     ]+vssubu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+8e860257[     ]+vssub.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+8e85c257[     ]+vssub.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+8c860257[     ]+vssub.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+8c85c257[     ]+vssub.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+22862257[     ]+vaaddu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2285e257[     ]+vaaddu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+20862257[     ]+vaaddu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2085e257[     ]+vaaddu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+26862257[     ]+vaadd.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2685e257[     ]+vaadd.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+24862257[     ]+vaadd.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2485e257[     ]+vaadd.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2a862257[     ]+vasubu.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2a85e257[     ]+vasubu.vx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+28862257[     ]+vasubu.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2885e257[     ]+vasubu.vx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+2e862257[     ]+vasub.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2e85e257[     ]+vasub.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+2c862257[     ]+vasub.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+2c85e257[     ]+vasub.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+9e860257[     ]+vsmul.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9e85c257[     ]+vsmul.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+9c860257[     ]+vsmul.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+9c85c257[     ]+vsmul.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+aa860257[     ]+vssrl.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+aa85c257[     ]+vssrl.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+aa80b257[     ]+vssrl.vi[     ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+aa8fb257[     ]+vssrl.vi[     ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+a8860257[     ]+vssrl.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+a885c257[     ]+vssrl.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+a880b257[     ]+vssrl.vi[     ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+a88fb257[     ]+vssrl.vi[     ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+ae860257[     ]+vssra.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ae85c257[     ]+vssra.vx[     ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+ae80b257[     ]+vssra.vi[     ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+ae8fb257[     ]+vssra.vi[     ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+ac860257[     ]+vssra.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+ac85c257[     ]+vssra.vx[     ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+ac80b257[     ]+vssra.vi[     ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+ac8fb257[     ]+vssra.vi[     ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+ba860257[     ]+vnclipu.wv[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ba85c257[     ]+vnclipu.wx[   ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+ba80b257[     ]+vnclipu.wi[   ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+ba8fb257[     ]+vnclipu.wi[   ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+b8860257[     ]+vnclipu.wv[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+b885c257[     ]+vnclipu.wx[   ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+b880b257[     ]+vnclipu.wi[   ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+b88fb257[     ]+vnclipu.wi[   ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+be860257[     ]+vnclip.wv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+be85c257[     ]+vnclip.wx[    ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+be80b257[     ]+vnclip.wi[    ]+v4,v8,1
+[      ]+[0-9a-f]+:[   ]+be8fb257[     ]+vnclip.wi[    ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+bc860257[     ]+vnclip.wv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+bc85c257[     ]+vnclip.wx[    ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+bc80b257[     ]+vnclip.wi[    ]+v4,v8,1,v0.t
+[      ]+[0-9a-f]+:[   ]+bc8fb257[     ]+vnclip.wi[    ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+02861257[     ]+vfadd.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+02865257[     ]+vfadd.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+00861257[     ]+vfadd.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+00865257[     ]+vfadd.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+0a861257[     ]+vfsub.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+0a865257[     ]+vfsub.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+08861257[     ]+vfsub.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+08865257[     ]+vfsub.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+9e865257[     ]+vfrsub.vf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+9c865257[     ]+vfrsub.vf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+c2861257[     ]+vfwadd.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+c2865257[     ]+vfwadd.vf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+c0861257[     ]+vfwadd.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c0865257[     ]+vfwadd.vf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+ca861257[     ]+vfwsub.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+ca865257[     ]+vfwsub.vf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+c8861257[     ]+vfwsub.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c8865257[     ]+vfwsub.vf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+d2861257[     ]+vfwadd.wv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+d2865257[     ]+vfwadd.wf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+d0861257[     ]+vfwadd.wv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+d0865257[     ]+vfwadd.wf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+da861257[     ]+vfwsub.wv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+da865257[     ]+vfwsub.wf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+d8861257[     ]+vfwsub.wv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+d8865257[     ]+vfwsub.wf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+92861257[     ]+vfmul.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+92865257[     ]+vfmul.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+90861257[     ]+vfmul.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+90865257[     ]+vfmul.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+82861257[     ]+vfdiv.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+82865257[     ]+vfdiv.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+80861257[     ]+vfdiv.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+80865257[     ]+vfdiv.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+86865257[     ]+vfrdiv.vf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+84865257[     ]+vfrdiv.vf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+e2861257[     ]+vfwmul.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+e2865257[     ]+vfwmul.vf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+e0861257[     ]+vfwmul.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+e0865257[     ]+vfwmul.vf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+a2861257[     ]+vfmadd.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+a2865257[     ]+vfmadd.vf[    ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+a6861257[     ]+vfnmadd.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+a6865257[     ]+vfnmadd.vf[   ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+aa861257[     ]+vfmsub.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+aa865257[     ]+vfmsub.vf[    ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+ae861257[     ]+vfnmsub.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+ae865257[     ]+vfnmsub.vf[   ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+a0861257[     ]+vfmadd.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a0865257[     ]+vfmadd.vf[    ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a4861257[     ]+vfnmadd.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a4865257[     ]+vfnmadd.vf[   ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a8861257[     ]+vfmsub.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+a8865257[     ]+vfmsub.vf[    ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+ac861257[     ]+vfnmsub.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+ac865257[     ]+vfnmsub.vf[   ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b2861257[     ]+vfmacc.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+b2865257[     ]+vfmacc.vf[    ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+b6861257[     ]+vfnmacc.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+b6865257[     ]+vfnmacc.vf[   ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+ba861257[     ]+vfmsac.vv[    ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+ba865257[     ]+vfmsac.vf[    ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+be861257[     ]+vfnmsac.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+be865257[     ]+vfnmsac.vf[   ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+b0861257[     ]+vfmacc.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b0865257[     ]+vfmacc.vf[    ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b4861257[     ]+vfnmacc.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b4865257[     ]+vfnmacc.vf[   ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b8861257[     ]+vfmsac.vv[    ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+b8865257[     ]+vfmsac.vf[    ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+bc861257[     ]+vfnmsac.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+bc865257[     ]+vfnmsac.vf[   ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f2861257[     ]+vfwmacc.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+f2865257[     ]+vfwmacc.vf[   ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+f6861257[     ]+vfwnmacc.vv[  ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+f6865257[     ]+vfwnmacc.vf[  ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+fa861257[     ]+vfwmsac.vv[   ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+fa865257[     ]+vfwmsac.vf[   ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+fe861257[     ]+vfwnmsac.vv[  ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+fe865257[     ]+vfwnmsac.vf[  ]+v4,fa2,v8
+[      ]+[0-9a-f]+:[   ]+f0861257[     ]+vfwmacc.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f0865257[     ]+vfwmacc.vf[   ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f4861257[     ]+vfwnmacc.vv[  ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f4865257[     ]+vfwnmacc.vf[  ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f8861257[     ]+vfwmsac.vv[   ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+f8865257[     ]+vfwmsac.vf[   ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+fc861257[     ]+vfwnmsac.vv[  ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+fc865257[     ]+vfwnmsac.vf[  ]+v4,fa2,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4e801257[     ]+vfsqrt.v[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4c801257[     ]+vfsqrt.v[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4e821257[     ]+vfrsqrt7.v[   ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4c821257[     ]+vfrsqrt7.v[   ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4e821257[     ]+vfrsqrt7.v[   ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4c821257[     ]+vfrsqrt7.v[   ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4e829257[     ]+vfrec7.v[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4c829257[     ]+vfrec7.v[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4e829257[     ]+vfrec7.v[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4c829257[     ]+vfrec7.v[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4e881257[     ]+vfclass.v[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4c881257[     ]+vfclass.v[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+12861257[     ]+vfmin.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+12865257[     ]+vfmin.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+1a861257[     ]+vfmax.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1a865257[     ]+vfmax.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+10861257[     ]+vfmin.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+10865257[     ]+vfmin.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+18861257[     ]+vfmax.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+18865257[     ]+vfmax.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+26841257[     ]+vfneg.v[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+24841257[     ]+vfneg.v[      ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+22861257[     ]+vfsgnj.vv[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+22865257[     ]+vfsgnj.vf[    ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+26861257[     ]+vfsgnjn.vv[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+26865257[     ]+vfsgnjn.vf[   ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+2a861257[     ]+vfsgnjx.vv[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+2a865257[     ]+vfsgnjx.vf[   ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+20861257[     ]+vfsgnj.vv[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+20865257[     ]+vfsgnj.vf[    ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+24861257[     ]+vfsgnjn.vv[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+24865257[     ]+vfsgnjn.vf[   ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+28861257[     ]+vfsgnjx.vv[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+28865257[     ]+vfsgnjx.vf[   ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+6ec41257[     ]+vmflt.vv[     ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+66c41257[     ]+vmfle.vv[     ]+v4,v12,v8
+[      ]+[0-9a-f]+:[   ]+6cc41257[     ]+vmflt.vv[     ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+64c41257[     ]+vmfle.vv[     ]+v4,v12,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+62861257[     ]+vmfeq.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+62865257[     ]+vmfeq.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+72861257[     ]+vmfne.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+72865257[     ]+vmfne.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+6e861257[     ]+vmflt.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6e865257[     ]+vmflt.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+66861257[     ]+vmfle.vv[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+66865257[     ]+vmfle.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+76865257[     ]+vmfgt.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+7e865257[     ]+vmfge.vf[     ]+v4,v8,fa2
+[      ]+[0-9a-f]+:[   ]+60861257[     ]+vmfeq.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+60865257[     ]+vmfeq.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+70861257[     ]+vmfne.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+70865257[     ]+vmfne.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+6c861257[     ]+vmflt.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+6c865257[     ]+vmflt.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+64861257[     ]+vmfle.vv[     ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+64865257[     ]+vmfle.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+74865257[     ]+vmfgt.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+7c865257[     ]+vmfge.vf[     ]+v4,v8,fa2,v0.t
+[      ]+[0-9a-f]+:[   ]+5c865257[     ]+vfmerge.vfm[  ]+v4,v8,fa2,v0
+[      ]+[0-9a-f]+:[   ]+5e05d257[     ]+vfmv.v.f[     ]+v4,fa1
+[      ]+[0-9a-f]+:[   ]+4a801257[     ]+vfcvt.xu.f.v[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a809257[     ]+vfcvt.x.f.v[  ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a831257[     ]+vfcvt.rtz.xu.f.v[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a839257[     ]+vfcvt.rtz.x.f.v[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a811257[     ]+vfcvt.f.xu.v[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a819257[     ]+vfcvt.f.x.v[  ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+48801257[     ]+vfcvt.xu.f.v[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48809257[     ]+vfcvt.x.f.v[  ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48831257[     ]+vfcvt.rtz.xu.f.v[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48839257[     ]+vfcvt.rtz.x.f.v[      ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48811257[     ]+vfcvt.f.xu.v[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48819257[     ]+vfcvt.f.x.v[  ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a841257[     ]+vfwcvt.xu.f.v[        ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a849257[     ]+vfwcvt.x.f.v[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a871257[     ]+vfwcvt.rtz.xu.f.v[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a879257[     ]+vfwcvt.rtz.x.f.v[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a851257[     ]+vfwcvt.f.xu.v[        ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a859257[     ]+vfwcvt.f.x.v[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a861257[     ]+vfwcvt.f.f.v[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+48841257[     ]+vfwcvt.xu.f.v[        ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48849257[     ]+vfwcvt.x.f.v[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48871257[     ]+vfwcvt.rtz.xu.f.v[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48879257[     ]+vfwcvt.rtz.x.f.v[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48851257[     ]+vfwcvt.f.xu.v[        ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48859257[     ]+vfwcvt.f.x.v[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48861257[     ]+vfwcvt.f.f.v[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+4a881257[     ]+vfncvt.xu.f.w[        ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a889257[     ]+vfncvt.x.f.w[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a8b1257[     ]+vfncvt.rtz.xu.f.w[    ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a8b9257[     ]+vfncvt.rtz.x.f.w[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a891257[     ]+vfncvt.f.xu.w[        ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a899257[     ]+vfncvt.f.x.w[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a8a1257[     ]+vfncvt.f.f.w[         ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+4a8a9257[     ]+vfncvt.rod.f.f.w[     ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+48881257[     ]+vfncvt.xu.f.w[        ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48889257[     ]+vfncvt.x.f.w[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+488b1257[     ]+vfncvt.rtz.xu.f.w[    ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+488b9257[     ]+vfncvt.rtz.x.f.w[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48891257[     ]+vfncvt.f.xu.w[        ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+48899257[     ]+vfncvt.f.x.w[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+488a1257[     ]+vfncvt.f.f.w[         ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+488a9257[     ]+vfncvt.rod.f.f.w[     ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+02862257[     ]+vredsum.vs[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1a842257[     ]+vredmaxu.vs[  ]+v4,v8,v8
+[      ]+[0-9a-f]+:[   ]+1e842257[     ]+vredmax.vs[   ]+v4,v8,v8
+[      ]+[0-9a-f]+:[   ]+12842257[     ]+vredminu.vs[  ]+v4,v8,v8
+[      ]+[0-9a-f]+:[   ]+16842257[     ]+vredmin.vs[   ]+v4,v8,v8
+[      ]+[0-9a-f]+:[   ]+06862257[     ]+vredand.vs[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+0a862257[     ]+vredor.vs[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+0e862257[     ]+vredxor.vs[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+00862257[     ]+vredsum.vs[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+18842257[     ]+vredmaxu.vs[  ]+v4,v8,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+1c842257[     ]+vredmax.vs[   ]+v4,v8,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+10842257[     ]+vredminu.vs[  ]+v4,v8,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+14842257[     ]+vredmin.vs[   ]+v4,v8,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+04862257[     ]+vredand.vs[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+08862257[     ]+vredor.vs[    ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0c862257[     ]+vredxor.vs[   ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c2860257[     ]+vwredsumu.vs[         ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+c6860257[     ]+vwredsum.vs[  ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+c0860257[     ]+vwredsumu.vs[         ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c4860257[     ]+vwredsum.vs[  ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+0e861257[     ]+vfredosum.vs[         ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+06861257[     ]+vfredsum.vs[  ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+1e861257[     ]+vfredmax.vs[  ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+16861257[     ]+vfredmin.vs[  ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+0c861257[     ]+vfredosum.vs[         ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+04861257[     ]+vfredsum.vs[  ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+1c861257[     ]+vfredmax.vs[  ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+14861257[     ]+vfredmin.vs[  ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+ce861257[     ]+vfwredosum.vs[        ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+c6861257[     ]+vfwredsum.vs[         ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+cc861257[     ]+vfwredosum.vs[        ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+c4861257[     ]+vfwredsum.vs[         ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+66842257[     ]+vmmv.m[       ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+66842257[     ]+vmmv.m[       ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+6e422257[     ]+vmclr.m[      ]+v4
+[      ]+[0-9a-f]+:[   ]+7e422257[     ]+vmset.m[      ]+v4
+[      ]+[0-9a-f]+:[   ]+76842257[     ]+vmnot.m[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+66862257[     ]+vmand.mm[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+76862257[     ]+vmnand.mm[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+62862257[     ]+vmandnot.mm[  ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6e862257[     ]+vmxor.mm[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+6a862257[     ]+vmor.mm[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+7a862257[     ]+vmnor.mm[     ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+72862257[     ]+vmornot.mm[   ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+7e862257[     ]+vmxnor.mm[    ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+42c82557[     ]+vpopc.m[      ]+a0,v12
+[      ]+[0-9a-f]+:[   ]+42c8a557[     ]+vfirst.m[     ]+a0,v12
+[      ]+[0-9a-f]+:[   ]+5280a257[     ]+vmsbf.m[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+5281a257[     ]+vmsif.m[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+52812257[     ]+vmsof.m[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+52882257[     ]+viota.m[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+5208a257[     ]+vid.v[        ]+v4
+[      ]+[0-9a-f]+:[   ]+40c82557[     ]+vpopc.m[      ]+a0,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+40c8a557[     ]+vfirst.m[     ]+a0,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+5080a257[     ]+vmsbf.m[      ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+5081a257[     ]+vmsif.m[      ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+50812257[     ]+vmsof.m[      ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+50882257[     ]+viota.m[      ]+v4,v8,v0.t
+[      ]+[0-9a-f]+:[   ]+5008a257[     ]+vid.v[        ]+v4,v0.t
+[      ]+[0-9a-f]+:[   ]+42c02557[     ]+vmv.x.s[      ]+a0,v12
+[      ]+[0-9a-f]+:[   ]+42056257[     ]+vmv.s.x[      ]+v4,a0
+[      ]+[0-9a-f]+:[   ]+42801557[     ]+vfmv.f.s[     ]+fa0,v8
+[      ]+[0-9a-f]+:[   ]+4205d257[     ]+vfmv.s.f[     ]+v4,fa1
+[      ]+[0-9a-f]+:[   ]+3a85c257[     ]+vslideup.vx[  ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+3a803257[     ]+vslideup.vi[  ]+v4,v8,0
+[      ]+[0-9a-f]+:[   ]+3a8fb257[     ]+vslideup.vi[  ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+3e85c257[     ]+vslidedown.vx[        ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+3e803257[     ]+vslidedown.vi[        ]+v4,v8,0
+[      ]+[0-9a-f]+:[   ]+3e8fb257[     ]+vslidedown.vi[        ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+3885c257[     ]+vslideup.vx[  ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+38803257[     ]+vslideup.vi[  ]+v4,v8,0,v0.t
+[      ]+[0-9a-f]+:[   ]+388fb257[     ]+vslideup.vi[  ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+3c85c257[     ]+vslidedown.vx[        ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+3c803257[     ]+vslidedown.vi[        ]+v4,v8,0,v0.t
+[      ]+[0-9a-f]+:[   ]+3c8fb257[     ]+vslidedown.vi[        ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+3a85e257[     ]+vslide1up.vx[         ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+3e85e257[     ]+vslide1down.vx[       ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+3885e257[     ]+vslide1up.vx[         ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+3c85e257[     ]+vslide1down.vx[       ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+3a85d257[     ]+vfslide1up.vf[        ]+v4,v8,fa1
+[      ]+[0-9a-f]+:[   ]+3e85d257[     ]+vfslide1down.vf[      ]+v4,v8,fa1
+[      ]+[0-9a-f]+:[   ]+3885d257[     ]+vfslide1up.vf[        ]+v4,v8,fa1,v0.t
+[      ]+[0-9a-f]+:[   ]+3c85d257[     ]+vfslide1down.vf[      ]+v4,v8,fa1,v0.t
+[      ]+[0-9a-f]+:[   ]+32860257[     ]+vrgather.vv[  ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+3285c257[     ]+vrgather.vx[  ]+v4,v8,a1
+[      ]+[0-9a-f]+:[   ]+32803257[     ]+vrgather.vi[  ]+v4,v8,0
+[      ]+[0-9a-f]+:[   ]+328fb257[     ]+vrgather.vi[  ]+v4,v8,31
+[      ]+[0-9a-f]+:[   ]+30860257[     ]+vrgather.vv[  ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+3085c257[     ]+vrgather.vx[  ]+v4,v8,a1,v0.t
+[      ]+[0-9a-f]+:[   ]+30803257[     ]+vrgather.vi[  ]+v4,v8,0,v0.t
+[      ]+[0-9a-f]+:[   ]+308fb257[     ]+vrgather.vi[  ]+v4,v8,31,v0.t
+[      ]+[0-9a-f]+:[   ]+3a860257[     ]+vrgatherei16.vv[      ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+38860257[     ]+vrgatherei16.vv[      ]+v4,v8,v12,v0.t
+[      ]+[0-9a-f]+:[   ]+5e862257[     ]+vcompress.vm[         ]+v4,v8,v12
+[      ]+[0-9a-f]+:[   ]+9e2030d7[     ]+vmv1r.v[      ]+v1,v2
+[      ]+[0-9a-f]+:[   ]+9e40b157[     ]+vmv2r.v[      ]+v2,v4
+[      ]+[0-9a-f]+:[   ]+9e81b257[     ]+vmv4r.v[      ]+v4,v8
+[      ]+[0-9a-f]+:[   ]+9e83b057[     ]+vmv8r.v[      ]+v0,v8
diff --git a/gas/testsuite/gas/riscv/extended/vector-insns.s b/gas/testsuite/gas/riscv/extended/vector-insns.s
new file mode 100644
index 0000000..5c78e28
--- /dev/null
+++ b/gas/testsuite/gas/riscv/extended/vector-insns.s
@@ -0,0 +1,2183 @@
+       vsetvl a0, a1, a2
+       vsetvli a0, a1, 0
+       vsetvli a0, a1, 0x7ff
+       vsetvli a0, a1, e16, m2
+       vsetvli a0, a1, e256, m8
+       vsetvli a0, a1, e512, m8
+       vsetvli a0, a1, e1024, m8
+       vsetvli a0, a1, e1024, m1
+       vsetvli a0, a1, e1024, mf2
+       vsetvli a0, a1, e512, mf4
+       vsetvli a0, a1, e256, mf8
+       vsetvli a0, a1, e256, m2, ta
+       vsetvli a0, a1, e256, m2, ma
+       vsetvli a0, a1, e256, m2, tu
+       vsetvli a0, a1, e256, m2, mu
+       vsetvli a0, a1, e256, m2, ta, ma
+       vsetvli a0, a1, e256, m2, tu, ma
+       vsetvli a0, a1, e256, m2, ta, mu
+       vsetvli a0, a1, e256, m2, tu, mu
+       vsetivli a0, 0xb, 0
+       vsetivli a0, 0xb, 0x3ff
+       vsetivli a0, 0xb, e16, m2
+       vsetivli a0, 0xb, e256, m8
+       vsetivli a0, 0xb, e512, m8
+       vsetivli a0, 0xb, e1024, m8
+       vsetivli a0, 0xb, e1024, m1
+       vsetivli a0, 0xb, e1024, mf2
+       vsetivli a0, 0xb, e512, mf4
+       vsetivli a0, 0xb, e256, mf8
+       vsetivli a0, 0xb, e256, m2, ta
+       vsetivli a0, 0xb, e256, m2, ma
+       vsetivli a0, 0xb, e256, m2, tu
+       vsetivli a0, 0xb, e256, m2, mu
+       vsetivli a0, 0xb, e256, m2, ta, ma
+       vsetivli a0, 0xb, e256, m2, tu, ma
+       vsetivli a0, 0xb, e256, m2, ta, mu
+       vsetivli a0, 0xb, e256, m2, tu, mu
+
+       vle1.v v4, (a0)
+       vle1.v v4, 0(a0)
+       vse1.v v4, (a0)
+       vse1.v v4, 0(a0)
+
+       vle8.v v4, (a0)
+       vle8.v v4, 0(a0)
+       vle8.v v4, (a0), v0.t
+       vse8.v v4, (a0)
+       vse8.v v4, 0(a0)
+       vse8.v v4, (a0), v0.t
+
+       vle16.v v4, (a0)
+       vle16.v v4, 0(a0)
+       vle16.v v4, (a0), v0.t
+       vse16.v v4, (a0)
+       vse16.v v4, 0(a0)
+       vse16.v v4, (a0), v0.t
+
+       vle32.v v4, (a0)
+       vle32.v v4, 0(a0)
+       vle32.v v4, (a0), v0.t
+       vse32.v v4, (a0)
+       vse32.v v4, 0(a0)
+       vse32.v v4, (a0), v0.t
+
+       vle64.v v4, (a0)
+       vle64.v v4, 0(a0)
+       vle64.v v4, (a0), v0.t
+       vse64.v v4, (a0)
+       vse64.v v4, 0(a0)
+       vse64.v v4, (a0), v0.t
+
+       vlse8.v v4, (a0), a1
+       vlse8.v v4, 0(a0), a1
+       vlse8.v v4, (a0), a1, v0.t
+       vsse8.v v4, (a0), a1
+       vsse8.v v4, 0(a0), a1
+       vsse8.v v4, (a0), a1, v0.t
+
+       vlse16.v v4, (a0), a1
+       vlse16.v v4, 0(a0), a1
+       vlse16.v v4, (a0), a1, v0.t
+       vsse16.v v4, (a0), a1
+       vsse16.v v4, 0(a0), a1
+       vsse16.v v4, (a0), a1, v0.t
+
+       vlse32.v v4, (a0), a1
+       vlse32.v v4, 0(a0), a1
+       vlse32.v v4, (a0), a1, v0.t
+       vsse32.v v4, (a0), a1
+       vsse32.v v4, 0(a0), a1
+       vsse32.v v4, (a0), a1, v0.t
+
+       vlse64.v v4, (a0), a1
+       vlse64.v v4, 0(a0), a1
+       vlse64.v v4, (a0), a1, v0.t
+       vsse64.v v4, (a0), a1
+       vsse64.v v4, 0(a0), a1
+       vsse64.v v4, (a0), a1, v0.t
+
+       vloxei8.v v4, (a0), v12
+       vloxei8.v v4, 0(a0), v12
+       vloxei8.v v4, (a0), v12, v0.t
+       vsoxei8.v v4, (a0), v12
+       vsoxei8.v v4, 0(a0), v12
+       vsoxei8.v v4, (a0), v12, v0.t
+       vluxei8.v v4, (a0), v12
+       vluxei8.v v4, 0(a0), v12
+       vluxei8.v v4, (a0), v12, v0.t
+       vsuxei8.v v4, (a0), v12
+       vsuxei8.v v4, 0(a0), v12
+       vsuxei8.v v4, (a0), v12, v0.t
+
+       vloxei16.v v4, (a0), v12
+       vloxei16.v v4, 0(a0), v12
+       vloxei16.v v4, (a0), v12, v0.t
+       vsoxei16.v v4, (a0), v12
+       vsoxei16.v v4, 0(a0), v12
+       vsoxei16.v v4, (a0), v12, v0.t
+       vluxei16.v v4, (a0), v12
+       vluxei16.v v4, 0(a0), v12
+       vluxei16.v v4, (a0), v12, v0.t
+       vsuxei16.v v4, (a0), v12
+       vsuxei16.v v4, 0(a0), v12
+       vsuxei16.v v4, (a0), v12, v0.t
+
+       vloxei32.v v4, (a0), v12
+       vloxei32.v v4, 0(a0), v12
+       vloxei32.v v4, (a0), v12, v0.t
+       vsoxei32.v v4, (a0), v12
+       vsoxei32.v v4, 0(a0), v12
+       vsoxei32.v v4, (a0), v12, v0.t
+       vluxei32.v v4, (a0), v12
+       vluxei32.v v4, 0(a0), v12
+       vluxei32.v v4, (a0), v12, v0.t
+       vsuxei32.v v4, (a0), v12
+       vsuxei32.v v4, 0(a0), v12
+       vsuxei32.v v4, (a0), v12, v0.t
+
+       vloxei64.v v4, (a0), v12
+       vloxei64.v v4, 0(a0), v12
+       vloxei64.v v4, (a0), v12, v0.t
+       vsoxei64.v v4, (a0), v12
+       vsoxei64.v v4, 0(a0), v12
+       vsoxei64.v v4, (a0), v12, v0.t
+       vluxei64.v v4, (a0), v12
+       vluxei64.v v4, 0(a0), v12
+       vluxei64.v v4, (a0), v12, v0.t
+       vsuxei64.v v4, (a0), v12
+       vsuxei64.v v4, 0(a0), v12
+       vsuxei64.v v4, (a0), v12, v0.t
+
+       vle8ff.v v4, (a0)
+       vle8ff.v v4, 0(a0)
+       vle8ff.v v4, (a0), v0.t
+
+       vle16ff.v v4, (a0)
+       vle16ff.v v4, 0(a0)
+       vle16ff.v v4, (a0), v0.t
+
+       vle32ff.v v4, (a0)
+       vle32ff.v v4, 0(a0)
+       vle32ff.v v4, (a0), v0.t
+
+       vle64ff.v v4, (a0)
+       vle64ff.v v4, 0(a0)
+       vle64ff.v v4, (a0), v0.t
+
+       vlseg2e8.v v4, (a0)
+       vlseg2e8.v v4, 0(a0)
+       vlseg2e8.v v4, (a0), v0.t
+       vsseg2e8.v v4, (a0)
+       vsseg2e8.v v4, 0(a0)
+       vsseg2e8.v v4, (a0), v0.t
+
+       vlseg3e8.v v4, (a0)
+       vlseg3e8.v v4, 0(a0)
+       vlseg3e8.v v4, (a0), v0.t
+       vsseg3e8.v v4, (a0)
+       vsseg3e8.v v4, 0(a0)
+       vsseg3e8.v v4, (a0), v0.t
+
+       vlseg4e8.v v4, (a0)
+       vlseg4e8.v v4, 0(a0)
+       vlseg4e8.v v4, (a0), v0.t
+       vsseg4e8.v v4, (a0)
+       vsseg4e8.v v4, 0(a0)
+       vsseg4e8.v v4, (a0), v0.t
+
+       vlseg5e8.v v4, (a0)
+       vlseg5e8.v v4, 0(a0)
+       vlseg5e8.v v4, (a0), v0.t
+       vsseg5e8.v v4, (a0)
+       vsseg5e8.v v4, 0(a0)
+       vsseg5e8.v v4, (a0), v0.t
+
+       vlseg6e8.v v4, (a0)
+       vlseg6e8.v v4, 0(a0)
+       vlseg6e8.v v4, (a0), v0.t
+       vsseg6e8.v v4, (a0)
+       vsseg6e8.v v4, 0(a0)
+       vsseg6e8.v v4, (a0), v0.t
+
+       vlseg7e8.v v4, (a0)
+       vlseg7e8.v v4, 0(a0)
+       vlseg7e8.v v4, (a0), v0.t
+       vsseg7e8.v v4, (a0)
+       vsseg7e8.v v4, 0(a0)
+       vsseg7e8.v v4, (a0), v0.t
+
+       vlseg8e8.v v4, (a0)
+       vlseg8e8.v v4, 0(a0)
+       vlseg8e8.v v4, (a0), v0.t
+       vsseg8e8.v v4, (a0)
+       vsseg8e8.v v4, 0(a0)
+       vsseg8e8.v v4, (a0), v0.t
+
+       vlseg2e16.v v4, (a0)
+       vlseg2e16.v v4, 0(a0)
+       vlseg2e16.v v4, (a0), v0.t
+       vsseg2e16.v v4, (a0)
+       vsseg2e16.v v4, 0(a0)
+       vsseg2e16.v v4, (a0), v0.t
+
+       vlseg3e16.v v4, (a0)
+       vlseg3e16.v v4, 0(a0)
+       vlseg3e16.v v4, (a0), v0.t
+       vsseg3e16.v v4, (a0)
+       vsseg3e16.v v4, 0(a0)
+       vsseg3e16.v v4, (a0), v0.t
+
+       vlseg4e16.v v4, (a0)
+       vlseg4e16.v v4, 0(a0)
+       vlseg4e16.v v4, (a0), v0.t
+       vsseg4e16.v v4, (a0)
+       vsseg4e16.v v4, 0(a0)
+       vsseg4e16.v v4, (a0), v0.t
+
+       vlseg5e16.v v4, (a0)
+       vlseg5e16.v v4, 0(a0)
+       vlseg5e16.v v4, (a0), v0.t
+       vsseg5e16.v v4, (a0)
+       vsseg5e16.v v4, 0(a0)
+       vsseg5e16.v v4, (a0), v0.t
+
+       vlseg6e16.v v4, (a0)
+       vlseg6e16.v v4, 0(a0)
+       vlseg6e16.v v4, (a0), v0.t
+       vsseg6e16.v v4, (a0)
+       vsseg6e16.v v4, 0(a0)
+       vsseg6e16.v v4, (a0), v0.t
+
+       vlseg7e16.v v4, (a0)
+       vlseg7e16.v v4, 0(a0)
+       vlseg7e16.v v4, (a0), v0.t
+       vsseg7e16.v v4, (a0)
+       vsseg7e16.v v4, 0(a0)
+       vsseg7e16.v v4, (a0), v0.t
+
+       vlseg8e16.v v4, (a0)
+       vlseg8e16.v v4, 0(a0)
+       vlseg8e16.v v4, (a0), v0.t
+       vsseg8e16.v v4, (a0)
+       vsseg8e16.v v4, 0(a0)
+       vsseg8e16.v v4, (a0), v0.t
+
+       vlseg2e32.v v4, (a0)
+       vlseg2e32.v v4, 0(a0)
+       vlseg2e32.v v4, (a0), v0.t
+       vsseg2e32.v v4, (a0)
+       vsseg2e32.v v4, 0(a0)
+       vsseg2e32.v v4, (a0), v0.t
+
+       vlseg3e32.v v4, (a0)
+       vlseg3e32.v v4, 0(a0)
+       vlseg3e32.v v4, (a0), v0.t
+       vsseg3e32.v v4, (a0)
+       vsseg3e32.v v4, 0(a0)
+       vsseg3e32.v v4, (a0), v0.t
+
+       vlseg4e32.v v4, (a0)
+       vlseg4e32.v v4, 0(a0)
+       vlseg4e32.v v4, (a0), v0.t
+       vsseg4e32.v v4, (a0)
+       vsseg4e32.v v4, 0(a0)
+       vsseg4e32.v v4, (a0), v0.t
+
+       vlseg5e32.v v4, (a0)
+       vlseg5e32.v v4, 0(a0)
+       vlseg5e32.v v4, (a0), v0.t
+       vsseg5e32.v v4, (a0)
+       vsseg5e32.v v4, 0(a0)
+       vsseg5e32.v v4, (a0), v0.t
+
+       vlseg6e32.v v4, (a0)
+       vlseg6e32.v v4, 0(a0)
+       vlseg6e32.v v4, (a0), v0.t
+       vsseg6e32.v v4, (a0)
+       vsseg6e32.v v4, 0(a0)
+       vsseg6e32.v v4, (a0), v0.t
+
+       vlseg7e32.v v4, (a0)
+       vlseg7e32.v v4, 0(a0)
+       vlseg7e32.v v4, (a0), v0.t
+       vsseg7e32.v v4, (a0)
+       vsseg7e32.v v4, 0(a0)
+       vsseg7e32.v v4, (a0), v0.t
+
+       vlseg8e32.v v4, (a0)
+       vlseg8e32.v v4, 0(a0)
+       vlseg8e32.v v4, (a0), v0.t
+       vsseg8e32.v v4, (a0)
+       vsseg8e32.v v4, 0(a0)
+       vsseg8e32.v v4, (a0), v0.t
+
+       vlseg2e64.v v4, (a0)
+       vlseg2e64.v v4, 0(a0)
+       vlseg2e64.v v4, (a0), v0.t
+       vsseg2e64.v v4, (a0)
+       vsseg2e64.v v4, 0(a0)
+       vsseg2e64.v v4, (a0), v0.t
+
+       vlseg3e64.v v4, (a0)
+       vlseg3e64.v v4, 0(a0)
+       vlseg3e64.v v4, (a0), v0.t
+       vsseg3e64.v v4, (a0)
+       vsseg3e64.v v4, 0(a0)
+       vsseg3e64.v v4, (a0), v0.t
+
+       vlseg4e64.v v4, (a0)
+       vlseg4e64.v v4, 0(a0)
+       vlseg4e64.v v4, (a0), v0.t
+       vsseg4e64.v v4, (a0)
+       vsseg4e64.v v4, 0(a0)
+       vsseg4e64.v v4, (a0), v0.t
+
+       vlseg5e64.v v4, (a0)
+       vlseg5e64.v v4, 0(a0)
+       vlseg5e64.v v4, (a0), v0.t
+       vsseg5e64.v v4, (a0)
+       vsseg5e64.v v4, 0(a0)
+       vsseg5e64.v v4, (a0), v0.t
+
+       vlseg6e64.v v4, (a0)
+       vlseg6e64.v v4, 0(a0)
+       vlseg6e64.v v4, (a0), v0.t
+       vsseg6e64.v v4, (a0)
+       vsseg6e64.v v4, 0(a0)
+       vsseg6e64.v v4, (a0), v0.t
+
+       vlseg7e64.v v4, (a0)
+       vlseg7e64.v v4, 0(a0)
+       vlseg7e64.v v4, (a0), v0.t
+       vsseg7e64.v v4, (a0)
+       vsseg7e64.v v4, 0(a0)
+       vsseg7e64.v v4, (a0), v0.t
+
+       vlseg8e64.v v4, (a0)
+       vlseg8e64.v v4, 0(a0)
+       vlseg8e64.v v4, (a0), v0.t
+       vsseg8e64.v v4, (a0)
+       vsseg8e64.v v4, 0(a0)
+       vsseg8e64.v v4, (a0), v0.t
+
+       vlsseg2e8.v v4, (a0), a1
+       vlsseg2e8.v v4, 0(a0), a1
+       vlsseg2e8.v v4, (a0), a1, v0.t
+       vssseg2e8.v v4, (a0), a1
+       vssseg2e8.v v4, 0(a0), a1
+       vssseg2e8.v v4, (a0), a1, v0.t
+
+       vlsseg3e8.v v4, (a0), a1
+       vlsseg3e8.v v4, 0(a0), a1
+       vlsseg3e8.v v4, (a0), a1, v0.t
+       vssseg3e8.v v4, (a0), a1
+       vssseg3e8.v v4, 0(a0), a1
+       vssseg3e8.v v4, (a0), a1, v0.t
+
+       vlsseg4e8.v v4, (a0), a1
+       vlsseg4e8.v v4, 0(a0), a1
+       vlsseg4e8.v v4, (a0), a1, v0.t
+       vssseg4e8.v v4, (a0), a1
+       vssseg4e8.v v4, 0(a0), a1
+       vssseg4e8.v v4, (a0), a1, v0.t
+
+       vlsseg5e8.v v4, (a0), a1
+       vlsseg5e8.v v4, 0(a0), a1
+       vlsseg5e8.v v4, (a0), a1, v0.t
+       vssseg5e8.v v4, (a0), a1
+       vssseg5e8.v v4, 0(a0), a1
+       vssseg5e8.v v4, (a0), a1, v0.t
+
+       vlsseg6e8.v v4, (a0), a1
+       vlsseg6e8.v v4, 0(a0), a1
+       vlsseg6e8.v v4, (a0), a1, v0.t
+       vssseg6e8.v v4, (a0), a1
+       vssseg6e8.v v4, 0(a0), a1
+       vssseg6e8.v v4, (a0), a1, v0.t
+
+       vlsseg7e8.v v4, (a0), a1
+       vlsseg7e8.v v4, 0(a0), a1
+       vlsseg7e8.v v4, (a0), a1, v0.t
+       vssseg7e8.v v4, (a0), a1
+       vssseg7e8.v v4, 0(a0), a1
+       vssseg7e8.v v4, (a0), a1, v0.t
+
+       vlsseg8e8.v v4, (a0), a1
+       vlsseg8e8.v v4, 0(a0), a1
+       vlsseg8e8.v v4, (a0), a1, v0.t
+       vssseg8e8.v v4, (a0), a1
+       vssseg8e8.v v4, 0(a0), a1
+       vssseg8e8.v v4, (a0), a1, v0.t
+
+       vlsseg2e16.v v4, (a0), a1
+       vlsseg2e16.v v4, 0(a0), a1
+       vlsseg2e16.v v4, (a0), a1, v0.t
+       vssseg2e16.v v4, (a0), a1
+       vssseg2e16.v v4, 0(a0), a1
+       vssseg2e16.v v4, (a0), a1, v0.t
+
+       vlsseg3e16.v v4, (a0), a1
+       vlsseg3e16.v v4, 0(a0), a1
+       vlsseg3e16.v v4, (a0), a1, v0.t
+       vssseg3e16.v v4, (a0), a1
+       vssseg3e16.v v4, 0(a0), a1
+       vssseg3e16.v v4, (a0), a1, v0.t
+
+       vlsseg4e16.v v4, (a0), a1
+       vlsseg4e16.v v4, 0(a0), a1
+       vlsseg4e16.v v4, (a0), a1, v0.t
+       vssseg4e16.v v4, (a0), a1
+       vssseg4e16.v v4, 0(a0), a1
+       vssseg4e16.v v4, (a0), a1, v0.t
+
+       vlsseg5e16.v v4, (a0), a1
+       vlsseg5e16.v v4, 0(a0), a1
+       vlsseg5e16.v v4, (a0), a1, v0.t
+       vssseg5e16.v v4, (a0), a1
+       vssseg5e16.v v4, 0(a0), a1
+       vssseg5e16.v v4, (a0), a1, v0.t
+
+       vlsseg6e16.v v4, (a0), a1
+       vlsseg6e16.v v4, 0(a0), a1
+       vlsseg6e16.v v4, (a0), a1, v0.t
+       vssseg6e16.v v4, (a0), a1
+       vssseg6e16.v v4, 0(a0), a1
+       vssseg6e16.v v4, (a0), a1, v0.t
+
+       vlsseg7e16.v v4, (a0), a1
+       vlsseg7e16.v v4, 0(a0), a1
+       vlsseg7e16.v v4, (a0), a1, v0.t
+       vssseg7e16.v v4, (a0), a1
+       vssseg7e16.v v4, 0(a0), a1
+       vssseg7e16.v v4, (a0), a1, v0.t
+
+       vlsseg8e16.v v4, (a0), a1
+       vlsseg8e16.v v4, 0(a0), a1
+       vlsseg8e16.v v4, (a0), a1, v0.t
+       vssseg8e16.v v4, (a0), a1
+       vssseg8e16.v v4, 0(a0), a1
+       vssseg8e16.v v4, (a0), a1, v0.t
+
+       vlsseg2e32.v v4, (a0), a1
+       vlsseg2e32.v v4, 0(a0), a1
+       vlsseg2e32.v v4, (a0), a1, v0.t
+       vssseg2e32.v v4, (a0), a1
+       vssseg2e32.v v4, 0(a0), a1
+       vssseg2e32.v v4, (a0), a1, v0.t
+
+       vlsseg3e32.v v4, (a0), a1
+       vlsseg3e32.v v4, 0(a0), a1
+       vlsseg3e32.v v4, (a0), a1, v0.t
+       vssseg3e32.v v4, (a0), a1
+       vssseg3e32.v v4, 0(a0), a1
+       vssseg3e32.v v4, (a0), a1, v0.t
+
+       vlsseg4e32.v v4, (a0), a1
+       vlsseg4e32.v v4, 0(a0), a1
+       vlsseg4e32.v v4, (a0), a1, v0.t
+       vssseg4e32.v v4, (a0), a1
+       vssseg4e32.v v4, 0(a0), a1
+       vssseg4e32.v v4, (a0), a1, v0.t
+
+       vlsseg5e32.v v4, (a0), a1
+       vlsseg5e32.v v4, 0(a0), a1
+       vlsseg5e32.v v4, (a0), a1, v0.t
+       vssseg5e32.v v4, (a0), a1
+       vssseg5e32.v v4, 0(a0), a1
+       vssseg5e32.v v4, (a0), a1, v0.t
+
+       vlsseg6e32.v v4, (a0), a1
+       vlsseg6e32.v v4, 0(a0), a1
+       vlsseg6e32.v v4, (a0), a1, v0.t
+       vssseg6e32.v v4, (a0), a1
+       vssseg6e32.v v4, 0(a0), a1
+       vssseg6e32.v v4, (a0), a1, v0.t
+
+       vlsseg7e32.v v4, (a0), a1
+       vlsseg7e32.v v4, 0(a0), a1
+       vlsseg7e32.v v4, (a0), a1, v0.t
+       vssseg7e32.v v4, (a0), a1
+       vssseg7e32.v v4, 0(a0), a1
+       vssseg7e32.v v4, (a0), a1, v0.t
+
+       vlsseg8e32.v v4, (a0), a1
+       vlsseg8e32.v v4, 0(a0), a1
+       vlsseg8e32.v v4, (a0), a1, v0.t
+       vssseg8e32.v v4, (a0), a1
+       vssseg8e32.v v4, 0(a0), a1
+       vssseg8e32.v v4, (a0), a1, v0.t
+
+       vlsseg2e64.v v4, (a0), a1
+       vlsseg2e64.v v4, 0(a0), a1
+       vlsseg2e64.v v4, (a0), a1, v0.t
+       vssseg2e64.v v4, (a0), a1
+       vssseg2e64.v v4, 0(a0), a1
+       vssseg2e64.v v4, (a0), a1, v0.t
+
+       vlsseg3e64.v v4, (a0), a1
+       vlsseg3e64.v v4, 0(a0), a1
+       vlsseg3e64.v v4, (a0), a1, v0.t
+       vssseg3e64.v v4, (a0), a1
+       vssseg3e64.v v4, 0(a0), a1
+       vssseg3e64.v v4, (a0), a1, v0.t
+
+       vlsseg4e64.v v4, (a0), a1
+       vlsseg4e64.v v4, 0(a0), a1
+       vlsseg4e64.v v4, (a0), a1, v0.t
+       vssseg4e64.v v4, (a0), a1
+       vssseg4e64.v v4, 0(a0), a1
+       vssseg4e64.v v4, (a0), a1, v0.t
+
+       vlsseg5e64.v v4, (a0), a1
+       vlsseg5e64.v v4, 0(a0), a1
+       vlsseg5e64.v v4, (a0), a1, v0.t
+       vssseg5e64.v v4, (a0), a1
+       vssseg5e64.v v4, 0(a0), a1
+       vssseg5e64.v v4, (a0), a1, v0.t
+
+       vlsseg6e64.v v4, (a0), a1
+       vlsseg6e64.v v4, 0(a0), a1
+       vlsseg6e64.v v4, (a0), a1, v0.t
+       vssseg6e64.v v4, (a0), a1
+       vssseg6e64.v v4, 0(a0), a1
+       vssseg6e64.v v4, (a0), a1, v0.t
+
+       vlsseg7e64.v v4, (a0), a1
+       vlsseg7e64.v v4, 0(a0), a1
+       vlsseg7e64.v v4, (a0), a1, v0.t
+       vssseg7e64.v v4, (a0), a1
+       vssseg7e64.v v4, 0(a0), a1
+       vssseg7e64.v v4, (a0), a1, v0.t
+
+       vlsseg8e64.v v4, (a0), a1
+       vlsseg8e64.v v4, 0(a0), a1
+       vlsseg8e64.v v4, (a0), a1, v0.t
+       vssseg8e64.v v4, (a0), a1
+       vssseg8e64.v v4, 0(a0), a1
+       vssseg8e64.v v4, (a0), a1, v0.t
+
+       vloxseg2ei8.v v4, (a0), v12
+       vloxseg2ei8.v v4, 0(a0), v12
+       vloxseg2ei8.v v4, (a0), v12, v0.t
+       vsoxseg2ei8.v v4, (a0), v12
+       vsoxseg2ei8.v v4, 0(a0), v12
+       vsoxseg2ei8.v v4, (a0), v12, v0.t
+
+       vloxseg3ei8.v v4, (a0), v12
+       vloxseg3ei8.v v4, 0(a0), v12
+       vloxseg3ei8.v v4, (a0), v12, v0.t
+       vsoxseg3ei8.v v4, (a0), v12
+       vsoxseg3ei8.v v4, 0(a0), v12
+       vsoxseg3ei8.v v4, (a0), v12, v0.t
+
+       vloxseg4ei8.v v4, (a0), v12
+       vloxseg4ei8.v v4, 0(a0), v12
+       vloxseg4ei8.v v4, (a0), v12, v0.t
+       vsoxseg4ei8.v v4, (a0), v12
+       vsoxseg4ei8.v v4, 0(a0), v12
+       vsoxseg4ei8.v v4, (a0), v12, v0.t
+
+       vloxseg5ei8.v v4, (a0), v12
+       vloxseg5ei8.v v4, 0(a0), v12
+       vloxseg5ei8.v v4, (a0), v12, v0.t
+       vsoxseg5ei8.v v4, (a0), v12
+       vsoxseg5ei8.v v4, 0(a0), v12
+       vsoxseg5ei8.v v4, (a0), v12, v0.t
+
+       vloxseg6ei8.v v4, (a0), v12
+       vloxseg6ei8.v v4, 0(a0), v12
+       vloxseg6ei8.v v4, (a0), v12, v0.t
+       vsoxseg6ei8.v v4, (a0), v12
+       vsoxseg6ei8.v v4, 0(a0), v12
+       vsoxseg6ei8.v v4, (a0), v12, v0.t
+
+       vloxseg7ei8.v v4, (a0), v12
+       vloxseg7ei8.v v4, 0(a0), v12
+       vloxseg7ei8.v v4, (a0), v12, v0.t
+       vsoxseg7ei8.v v4, (a0), v12
+       vsoxseg7ei8.v v4, 0(a0), v12
+       vsoxseg7ei8.v v4, (a0), v12, v0.t
+
+       vloxseg8ei8.v v4, (a0), v12
+       vloxseg8ei8.v v4, 0(a0), v12
+       vloxseg8ei8.v v4, (a0), v12, v0.t
+       vsoxseg8ei8.v v4, (a0), v12
+       vsoxseg8ei8.v v4, 0(a0), v12
+       vsoxseg8ei8.v v4, (a0), v12, v0.t
+
+       vloxseg2ei16.v v4, (a0), v12
+       vloxseg2ei16.v v4, 0(a0), v12
+       vloxseg2ei16.v v4, (a0), v12, v0.t
+       vsoxseg2ei16.v v4, (a0), v12
+       vsoxseg2ei16.v v4, 0(a0), v12
+       vsoxseg2ei16.v v4, (a0), v12, v0.t
+
+       vloxseg3ei16.v v4, (a0), v12
+       vloxseg3ei16.v v4, 0(a0), v12
+       vloxseg3ei16.v v4, (a0), v12, v0.t
+       vsoxseg3ei16.v v4, (a0), v12
+       vsoxseg3ei16.v v4, 0(a0), v12
+       vsoxseg3ei16.v v4, (a0), v12, v0.t
+
+       vloxseg4ei16.v v4, (a0), v12
+       vloxseg4ei16.v v4, 0(a0), v12
+       vloxseg4ei16.v v4, (a0), v12, v0.t
+       vsoxseg4ei16.v v4, (a0), v12
+       vsoxseg4ei16.v v4, 0(a0), v12
+       vsoxseg4ei16.v v4, (a0), v12, v0.t
+
+       vloxseg5ei16.v v4, (a0), v12
+       vloxseg5ei16.v v4, 0(a0), v12
+       vloxseg5ei16.v v4, (a0), v12, v0.t
+       vsoxseg5ei16.v v4, (a0), v12
+       vsoxseg5ei16.v v4, 0(a0), v12
+       vsoxseg5ei16.v v4, (a0), v12, v0.t
+
+       vloxseg6ei16.v v4, (a0), v12
+       vloxseg6ei16.v v4, 0(a0), v12
+       vloxseg6ei16.v v4, (a0), v12, v0.t
+       vsoxseg6ei16.v v4, (a0), v12
+       vsoxseg6ei16.v v4, 0(a0), v12
+       vsoxseg6ei16.v v4, (a0), v12, v0.t
+
+       vloxseg7ei16.v v4, (a0), v12
+       vloxseg7ei16.v v4, 0(a0), v12
+       vloxseg7ei16.v v4, (a0), v12, v0.t
+       vsoxseg7ei16.v v4, (a0), v12
+       vsoxseg7ei16.v v4, 0(a0), v12
+       vsoxseg7ei16.v v4, (a0), v12, v0.t
+
+       vloxseg8ei16.v v4, (a0), v12
+       vloxseg8ei16.v v4, 0(a0), v12
+       vloxseg8ei16.v v4, (a0), v12, v0.t
+       vsoxseg8ei16.v v4, (a0), v12
+       vsoxseg8ei16.v v4, 0(a0), v12
+       vsoxseg8ei16.v v4, (a0), v12, v0.t
+
+       vloxseg2ei32.v v4, (a0), v12
+       vloxseg2ei32.v v4, 0(a0), v12
+       vloxseg2ei32.v v4, (a0), v12, v0.t
+       vsoxseg2ei32.v v4, (a0), v12
+       vsoxseg2ei32.v v4, 0(a0), v12
+       vsoxseg2ei32.v v4, (a0), v12, v0.t
+
+       vloxseg3ei32.v v4, (a0), v12
+       vloxseg3ei32.v v4, 0(a0), v12
+       vloxseg3ei32.v v4, (a0), v12, v0.t
+       vsoxseg3ei32.v v4, (a0), v12
+       vsoxseg3ei32.v v4, 0(a0), v12
+       vsoxseg3ei32.v v4, (a0), v12, v0.t
+
+       vloxseg4ei32.v v4, (a0), v12
+       vloxseg4ei32.v v4, 0(a0), v12
+       vloxseg4ei32.v v4, (a0), v12, v0.t
+       vsoxseg4ei32.v v4, (a0), v12
+       vsoxseg4ei32.v v4, 0(a0), v12
+       vsoxseg4ei32.v v4, (a0), v12, v0.t
+
+       vloxseg5ei32.v v4, (a0), v12
+       vloxseg5ei32.v v4, 0(a0), v12
+       vloxseg5ei32.v v4, (a0), v12, v0.t
+       vsoxseg5ei32.v v4, (a0), v12
+       vsoxseg5ei32.v v4, 0(a0), v12
+       vsoxseg5ei32.v v4, (a0), v12, v0.t
+
+       vloxseg6ei32.v v4, (a0), v12
+       vloxseg6ei32.v v4, 0(a0), v12
+       vloxseg6ei32.v v4, (a0), v12, v0.t
+       vsoxseg6ei32.v v4, (a0), v12
+       vsoxseg6ei32.v v4, 0(a0), v12
+       vsoxseg6ei32.v v4, (a0), v12, v0.t
+
+       vloxseg7ei32.v v4, (a0), v12
+       vloxseg7ei32.v v4, 0(a0), v12
+       vloxseg7ei32.v v4, (a0), v12, v0.t
+       vsoxseg7ei32.v v4, (a0), v12
+       vsoxseg7ei32.v v4, 0(a0), v12
+       vsoxseg7ei32.v v4, (a0), v12, v0.t
+
+       vloxseg8ei32.v v4, (a0), v12
+       vloxseg8ei32.v v4, 0(a0), v12
+       vloxseg8ei32.v v4, (a0), v12, v0.t
+       vsoxseg8ei32.v v4, (a0), v12
+       vsoxseg8ei32.v v4, 0(a0), v12
+       vsoxseg8ei32.v v4, (a0), v12, v0.t
+
+       vloxseg2ei64.v v4, (a0), v12
+       vloxseg2ei64.v v4, 0(a0), v12
+       vloxseg2ei64.v v4, (a0), v12, v0.t
+       vsoxseg2ei64.v v4, (a0), v12
+       vsoxseg2ei64.v v4, 0(a0), v12
+       vsoxseg2ei64.v v4, (a0), v12, v0.t
+
+       vloxseg3ei64.v v4, (a0), v12
+       vloxseg3ei64.v v4, 0(a0), v12
+       vloxseg3ei64.v v4, (a0), v12, v0.t
+       vsoxseg3ei64.v v4, (a0), v12
+       vsoxseg3ei64.v v4, 0(a0), v12
+       vsoxseg3ei64.v v4, (a0), v12, v0.t
+
+       vloxseg4ei64.v v4, (a0), v12
+       vloxseg4ei64.v v4, 0(a0), v12
+       vloxseg4ei64.v v4, (a0), v12, v0.t
+       vsoxseg4ei64.v v4, (a0), v12
+       vsoxseg4ei64.v v4, 0(a0), v12
+       vsoxseg4ei64.v v4, (a0), v12, v0.t
+
+       vloxseg5ei64.v v4, (a0), v12
+       vloxseg5ei64.v v4, 0(a0), v12
+       vloxseg5ei64.v v4, (a0), v12, v0.t
+       vsoxseg5ei64.v v4, (a0), v12
+       vsoxseg5ei64.v v4, 0(a0), v12
+       vsoxseg5ei64.v v4, (a0), v12, v0.t
+
+       vloxseg6ei64.v v4, (a0), v12
+       vloxseg6ei64.v v4, 0(a0), v12
+       vloxseg6ei64.v v4, (a0), v12, v0.t
+       vsoxseg6ei64.v v4, (a0), v12
+       vsoxseg6ei64.v v4, 0(a0), v12
+       vsoxseg6ei64.v v4, (a0), v12, v0.t
+
+       vloxseg7ei64.v v4, (a0), v12
+       vloxseg7ei64.v v4, 0(a0), v12
+       vloxseg7ei64.v v4, (a0), v12, v0.t
+       vsoxseg7ei64.v v4, (a0), v12
+       vsoxseg7ei64.v v4, 0(a0), v12
+       vsoxseg7ei64.v v4, (a0), v12, v0.t
+
+       vloxseg8ei64.v v4, (a0), v12
+       vloxseg8ei64.v v4, 0(a0), v12
+       vloxseg8ei64.v v4, (a0), v12, v0.t
+       vsoxseg8ei64.v v4, (a0), v12
+       vsoxseg8ei64.v v4, 0(a0), v12
+       vsoxseg8ei64.v v4, (a0), v12, v0.t
+
+       vluxseg2ei8.v v4, (a0), v12
+       vluxseg2ei8.v v4, 0(a0), v12
+       vluxseg2ei8.v v4, (a0), v12, v0.t
+       vsuxseg2ei8.v v4, (a0), v12
+       vsuxseg2ei8.v v4, 0(a0), v12
+       vsuxseg2ei8.v v4, (a0), v12, v0.t
+
+       vluxseg3ei8.v v4, (a0), v12
+       vluxseg3ei8.v v4, 0(a0), v12
+       vluxseg3ei8.v v4, (a0), v12, v0.t
+       vsuxseg3ei8.v v4, (a0), v12
+       vsuxseg3ei8.v v4, 0(a0), v12
+       vsuxseg3ei8.v v4, (a0), v12, v0.t
+
+       vluxseg4ei8.v v4, (a0), v12
+       vluxseg4ei8.v v4, 0(a0), v12
+       vluxseg4ei8.v v4, (a0), v12, v0.t
+       vsuxseg4ei8.v v4, (a0), v12
+       vsuxseg4ei8.v v4, 0(a0), v12
+       vsuxseg4ei8.v v4, (a0), v12, v0.t
+
+       vluxseg5ei8.v v4, (a0), v12
+       vluxseg5ei8.v v4, 0(a0), v12
+       vluxseg5ei8.v v4, (a0), v12, v0.t
+       vsuxseg5ei8.v v4, (a0), v12
+       vsuxseg5ei8.v v4, 0(a0), v12
+       vsuxseg5ei8.v v4, (a0), v12, v0.t
+
+       vluxseg6ei8.v v4, (a0), v12
+       vluxseg6ei8.v v4, 0(a0), v12
+       vluxseg6ei8.v v4, (a0), v12, v0.t
+       vsuxseg6ei8.v v4, (a0), v12
+       vsuxseg6ei8.v v4, 0(a0), v12
+       vsuxseg6ei8.v v4, (a0), v12, v0.t
+
+       vluxseg7ei8.v v4, (a0), v12
+       vluxseg7ei8.v v4, 0(a0), v12
+       vluxseg7ei8.v v4, (a0), v12, v0.t
+       vsuxseg7ei8.v v4, (a0), v12
+       vsuxseg7ei8.v v4, 0(a0), v12
+       vsuxseg7ei8.v v4, (a0), v12, v0.t
+
+       vluxseg8ei8.v v4, (a0), v12
+       vluxseg8ei8.v v4, 0(a0), v12
+       vluxseg8ei8.v v4, (a0), v12, v0.t
+       vsuxseg8ei8.v v4, (a0), v12
+       vsuxseg8ei8.v v4, 0(a0), v12
+       vsuxseg8ei8.v v4, (a0), v12, v0.t
+
+       vluxseg2ei16.v v4, (a0), v12
+       vluxseg2ei16.v v4, 0(a0), v12
+       vluxseg2ei16.v v4, (a0), v12, v0.t
+       vsuxseg2ei16.v v4, (a0), v12
+       vsuxseg2ei16.v v4, 0(a0), v12
+       vsuxseg2ei16.v v4, (a0), v12, v0.t
+
+       vluxseg3ei16.v v4, (a0), v12
+       vluxseg3ei16.v v4, 0(a0), v12
+       vluxseg3ei16.v v4, (a0), v12, v0.t
+       vsuxseg3ei16.v v4, (a0), v12
+       vsuxseg3ei16.v v4, 0(a0), v12
+       vsuxseg3ei16.v v4, (a0), v12, v0.t
+
+       vluxseg4ei16.v v4, (a0), v12
+       vluxseg4ei16.v v4, 0(a0), v12
+       vluxseg4ei16.v v4, (a0), v12, v0.t
+       vsuxseg4ei16.v v4, (a0), v12
+       vsuxseg4ei16.v v4, 0(a0), v12
+       vsuxseg4ei16.v v4, (a0), v12, v0.t
+
+       vluxseg5ei16.v v4, (a0), v12
+       vluxseg5ei16.v v4, 0(a0), v12
+       vluxseg5ei16.v v4, (a0), v12, v0.t
+       vsuxseg5ei16.v v4, (a0), v12
+       vsuxseg5ei16.v v4, 0(a0), v12
+       vsuxseg5ei16.v v4, (a0), v12, v0.t
+
+       vluxseg6ei16.v v4, (a0), v12
+       vluxseg6ei16.v v4, 0(a0), v12
+       vluxseg6ei16.v v4, (a0), v12, v0.t
+       vsuxseg6ei16.v v4, (a0), v12
+       vsuxseg6ei16.v v4, 0(a0), v12
+       vsuxseg6ei16.v v4, (a0), v12, v0.t
+
+       vluxseg7ei16.v v4, (a0), v12
+       vluxseg7ei16.v v4, 0(a0), v12
+       vluxseg7ei16.v v4, (a0), v12, v0.t
+       vsuxseg7ei16.v v4, (a0), v12
+       vsuxseg7ei16.v v4, 0(a0), v12
+       vsuxseg7ei16.v v4, (a0), v12, v0.t
+
+       vluxseg8ei16.v v4, (a0), v12
+       vluxseg8ei16.v v4, 0(a0), v12
+       vluxseg8ei16.v v4, (a0), v12, v0.t
+       vsuxseg8ei16.v v4, (a0), v12
+       vsuxseg8ei16.v v4, 0(a0), v12
+       vsuxseg8ei16.v v4, (a0), v12, v0.t
+
+       vluxseg2ei32.v v4, (a0), v12
+       vluxseg2ei32.v v4, 0(a0), v12
+       vluxseg2ei32.v v4, (a0), v12, v0.t
+       vsuxseg2ei32.v v4, (a0), v12
+       vsuxseg2ei32.v v4, 0(a0), v12
+       vsuxseg2ei32.v v4, (a0), v12, v0.t
+
+       vluxseg3ei32.v v4, (a0), v12
+       vluxseg3ei32.v v4, 0(a0), v12
+       vluxseg3ei32.v v4, (a0), v12, v0.t
+       vsuxseg3ei32.v v4, (a0), v12
+       vsuxseg3ei32.v v4, 0(a0), v12
+       vsuxseg3ei32.v v4, (a0), v12, v0.t
+
+       vluxseg4ei32.v v4, (a0), v12
+       vluxseg4ei32.v v4, 0(a0), v12
+       vluxseg4ei32.v v4, (a0), v12, v0.t
+       vsuxseg4ei32.v v4, (a0), v12
+       vsuxseg4ei32.v v4, 0(a0), v12
+       vsuxseg4ei32.v v4, (a0), v12, v0.t
+
+       vluxseg5ei32.v v4, (a0), v12
+       vluxseg5ei32.v v4, 0(a0), v12
+       vluxseg5ei32.v v4, (a0), v12, v0.t
+       vsuxseg5ei32.v v4, (a0), v12
+       vsuxseg5ei32.v v4, 0(a0), v12
+       vsuxseg5ei32.v v4, (a0), v12, v0.t
+
+       vluxseg6ei32.v v4, (a0), v12
+       vluxseg6ei32.v v4, 0(a0), v12
+       vluxseg6ei32.v v4, (a0), v12, v0.t
+       vsuxseg6ei32.v v4, (a0), v12
+       vsuxseg6ei32.v v4, 0(a0), v12
+       vsuxseg6ei32.v v4, (a0), v12, v0.t
+
+       vluxseg7ei32.v v4, (a0), v12
+       vluxseg7ei32.v v4, 0(a0), v12
+       vluxseg7ei32.v v4, (a0), v12, v0.t
+       vsuxseg7ei32.v v4, (a0), v12
+       vsuxseg7ei32.v v4, 0(a0), v12
+       vsuxseg7ei32.v v4, (a0), v12, v0.t
+
+       vluxseg8ei32.v v4, (a0), v12
+       vluxseg8ei32.v v4, 0(a0), v12
+       vluxseg8ei32.v v4, (a0), v12, v0.t
+       vsuxseg8ei32.v v4, (a0), v12
+       vsuxseg8ei32.v v4, 0(a0), v12
+       vsuxseg8ei32.v v4, (a0), v12, v0.t
+
+       vluxseg2ei64.v v4, (a0), v12
+       vluxseg2ei64.v v4, 0(a0), v12
+       vluxseg2ei64.v v4, (a0), v12, v0.t
+       vsuxseg2ei64.v v4, (a0), v12
+       vsuxseg2ei64.v v4, 0(a0), v12
+       vsuxseg2ei64.v v4, (a0), v12, v0.t
+
+       vluxseg3ei64.v v4, (a0), v12
+       vluxseg3ei64.v v4, 0(a0), v12
+       vluxseg3ei64.v v4, (a0), v12, v0.t
+       vsuxseg3ei64.v v4, (a0), v12
+       vsuxseg3ei64.v v4, 0(a0), v12
+       vsuxseg3ei64.v v4, (a0), v12, v0.t
+
+       vluxseg4ei64.v v4, (a0), v12
+       vluxseg4ei64.v v4, 0(a0), v12
+       vluxseg4ei64.v v4, (a0), v12, v0.t
+       vsuxseg4ei64.v v4, (a0), v12
+       vsuxseg4ei64.v v4, 0(a0), v12
+       vsuxseg4ei64.v v4, (a0), v12, v0.t
+
+       vluxseg5ei64.v v4, (a0), v12
+       vluxseg5ei64.v v4, 0(a0), v12
+       vluxseg5ei64.v v4, (a0), v12, v0.t
+       vsuxseg5ei64.v v4, (a0), v12
+       vsuxseg5ei64.v v4, 0(a0), v12
+       vsuxseg5ei64.v v4, (a0), v12, v0.t
+
+       vluxseg6ei64.v v4, (a0), v12
+       vluxseg6ei64.v v4, 0(a0), v12
+       vluxseg6ei64.v v4, (a0), v12, v0.t
+       vsuxseg6ei64.v v4, (a0), v12
+       vsuxseg6ei64.v v4, 0(a0), v12
+       vsuxseg6ei64.v v4, (a0), v12, v0.t
+
+       vluxseg7ei64.v v4, (a0), v12
+       vluxseg7ei64.v v4, 0(a0), v12
+       vluxseg7ei64.v v4, (a0), v12, v0.t
+       vsuxseg7ei64.v v4, (a0), v12
+       vsuxseg7ei64.v v4, 0(a0), v12
+       vsuxseg7ei64.v v4, (a0), v12, v0.t
+
+       vluxseg8ei64.v v4, (a0), v12
+       vluxseg8ei64.v v4, 0(a0), v12
+       vluxseg8ei64.v v4, (a0), v12, v0.t
+       vsuxseg8ei64.v v4, (a0), v12
+       vsuxseg8ei64.v v4, 0(a0), v12
+       vsuxseg8ei64.v v4, (a0), v12, v0.t
+
+       vlseg2e8ff.v v4, (a0)
+       vlseg2e8ff.v v4, 0(a0)
+       vlseg2e8ff.v v4, (a0), v0.t
+
+       vlseg3e8ff.v v4, (a0)
+       vlseg3e8ff.v v4, 0(a0)
+       vlseg3e8ff.v v4, (a0), v0.t
+
+       vlseg4e8ff.v v4, (a0)
+       vlseg4e8ff.v v4, 0(a0)
+       vlseg4e8ff.v v4, (a0), v0.t
+
+       vlseg5e8ff.v v4, (a0)
+       vlseg5e8ff.v v4, 0(a0)
+       vlseg5e8ff.v v4, (a0), v0.t
+
+       vlseg6e8ff.v v4, (a0)
+       vlseg6e8ff.v v4, 0(a0)
+       vlseg6e8ff.v v4, (a0), v0.t
+
+       vlseg7e8ff.v v4, (a0)
+       vlseg7e8ff.v v4, 0(a0)
+       vlseg7e8ff.v v4, (a0), v0.t
+
+       vlseg8e8ff.v v4, (a0)
+       vlseg8e8ff.v v4, 0(a0)
+       vlseg8e8ff.v v4, (a0), v0.t
+
+       vlseg2e16ff.v v4, (a0)
+       vlseg2e16ff.v v4, 0(a0)
+       vlseg2e16ff.v v4, (a0), v0.t
+
+       vlseg3e16ff.v v4, (a0)
+       vlseg3e16ff.v v4, 0(a0)
+       vlseg3e16ff.v v4, (a0), v0.t
+
+       vlseg4e16ff.v v4, (a0)
+       vlseg4e16ff.v v4, 0(a0)
+       vlseg4e16ff.v v4, (a0), v0.t
+
+       vlseg5e16ff.v v4, (a0)
+       vlseg5e16ff.v v4, 0(a0)
+       vlseg5e16ff.v v4, (a0), v0.t
+
+       vlseg6e16ff.v v4, (a0)
+       vlseg6e16ff.v v4, 0(a0)
+       vlseg6e16ff.v v4, (a0), v0.t
+
+       vlseg7e16ff.v v4, (a0)
+       vlseg7e16ff.v v4, 0(a0)
+       vlseg7e16ff.v v4, (a0), v0.t
+
+       vlseg8e16ff.v v4, (a0)
+       vlseg8e16ff.v v4, 0(a0)
+       vlseg8e16ff.v v4, (a0), v0.t
+
+       vlseg2e32ff.v v4, (a0)
+       vlseg2e32ff.v v4, 0(a0)
+       vlseg2e32ff.v v4, (a0), v0.t
+
+       vlseg3e32ff.v v4, (a0)
+       vlseg3e32ff.v v4, 0(a0)
+       vlseg3e32ff.v v4, (a0), v0.t
+
+       vlseg4e32ff.v v4, (a0)
+       vlseg4e32ff.v v4, 0(a0)
+       vlseg4e32ff.v v4, (a0), v0.t
+
+       vlseg5e32ff.v v4, (a0)
+       vlseg5e32ff.v v4, 0(a0)
+       vlseg5e32ff.v v4, (a0), v0.t
+
+       vlseg6e32ff.v v4, (a0)
+       vlseg6e32ff.v v4, 0(a0)
+       vlseg6e32ff.v v4, (a0), v0.t
+
+       vlseg7e32ff.v v4, (a0)
+       vlseg7e32ff.v v4, 0(a0)
+       vlseg7e32ff.v v4, (a0), v0.t
+
+       vlseg8e32ff.v v4, (a0)
+       vlseg8e32ff.v v4, 0(a0)
+       vlseg8e32ff.v v4, (a0), v0.t
+
+       vlseg2e64ff.v v4, (a0)
+       vlseg2e64ff.v v4, 0(a0)
+       vlseg2e64ff.v v4, (a0), v0.t
+
+       vlseg3e64ff.v v4, (a0)
+       vlseg3e64ff.v v4, 0(a0)
+       vlseg3e64ff.v v4, (a0), v0.t
+
+       vlseg4e64ff.v v4, (a0)
+       vlseg4e64ff.v v4, 0(a0)
+       vlseg4e64ff.v v4, (a0), v0.t
+
+       vlseg5e64ff.v v4, (a0)
+       vlseg5e64ff.v v4, 0(a0)
+       vlseg5e64ff.v v4, (a0), v0.t
+
+       vlseg6e64ff.v v4, (a0)
+       vlseg6e64ff.v v4, 0(a0)
+       vlseg6e64ff.v v4, (a0), v0.t
+
+       vlseg7e64ff.v v4, (a0)
+       vlseg7e64ff.v v4, 0(a0)
+       vlseg7e64ff.v v4, (a0), v0.t
+
+       vlseg8e64ff.v v4, (a0)
+       vlseg8e64ff.v v4, 0(a0)
+       vlseg8e64ff.v v4, (a0), v0.t
+
+       vl1r.v v3, (a0)
+       vl1r.v v3, 0(a0)
+       vl1re8.v v3, (a0)
+       vl1re8.v v3, 0(a0)
+       vl1re16.v v3, (a0)
+       vl1re16.v v3, 0(a0)
+       vl1re32.v v3, (a0)
+       vl1re32.v v3, 0(a0)
+       vl1re64.v v3, (a0)
+       vl1re64.v v3, 0(a0)
+
+       vl2r.v v2, (a0)
+       vl2r.v v2, 0(a0)
+       vl2re8.v v2, (a0)
+       vl2re8.v v2, 0(a0)
+       vl2re16.v v2, (a0)
+       vl2re16.v v2, 0(a0)
+       vl2re32.v v2, (a0)
+       vl2re32.v v2, 0(a0)
+       vl2re64.v v2, (a0)
+       vl2re64.v v2, 0(a0)
+
+       vl4r.v v4, (a0)
+       vl4r.v v4, 0(a0)
+       vl4re8.v v4, (a0)
+       vl4re8.v v4, 0(a0)
+       vl4re16.v v4, (a0)
+       vl4re16.v v4, 0(a0)
+       vl4re32.v v4, (a0)
+       vl4re32.v v4, 0(a0)
+       vl4re64.v v4, (a0)
+       vl4re64.v v4, 0(a0)
+
+       vl8r.v v8, (a0)
+       vl8r.v v8, 0(a0)
+       vl8re8.v v8, (a0)
+       vl8re8.v v8, 0(a0)
+       vl8re16.v v8, (a0)
+       vl8re16.v v8, 0(a0)
+       vl8re32.v v8, (a0)
+       vl8re32.v v8, 0(a0)
+       vl8re64.v v8, (a0)
+       vl8re64.v v8, 0(a0)
+
+       vs1r.v v3, (a1)
+       vs1r.v v3, 0(a1)
+       vs2r.v v2, (a1)
+       vs2r.v v2, 0(a1)
+       vs4r.v v4, (a1)
+       vs4r.v v4, 0(a1)
+       vs8r.v v8, (a1)
+       vs8r.v v8, 0(a1)
+
+       vamoaddei8.v v4, (a1), v8, v4
+       vamoaddei8.v x0, (a1), v8, v4
+       vamoaddei8.v v4, (a1), v8, v4, v0.t
+       vamoaddei8.v x0, (a1), v8, v4, v0.t
+       vamoswapei8.v v4, (a1), v8, v4
+       vamoswapei8.v x0, (a1), v8, v4
+       vamoswapei8.v v4, (a1), v8, v4, v0.t
+       vamoswapei8.v x0, (a1), v8, v4, v0.t
+
+       vamoxorei8.v v4, (a1), v8, v4
+       vamoxorei8.v x0, (a1), v8, v4
+       vamoxorei8.v v4, (a1), v8, v4, v0.t
+       vamoxorei8.v x0, (a1), v8, v4, v0.t
+       vamoandei8.v v4, (a1), v8, v4
+       vamoandei8.v x0, (a1), v8, v4
+       vamoandei8.v v4, (a1), v8, v4, v0.t
+       vamoandei8.v x0, (a1), v8, v4, v0.t
+       vamoorei8.v v4, (a1), v8, v4
+       vamoorei8.v x0, (a1), v8, v4
+       vamoorei8.v v4, (a1), v8, v4, v0.t
+       vamoorei8.v x0, (a1), v8, v4, v0.t
+
+       vamominei8.v v4, (a1), v8, v4
+       vamominei8.v x0, (a1), v8, v4
+       vamominei8.v v4, (a1), v8, v4, v0.t
+       vamominei8.v x0, (a1), v8, v4, v0.t
+       vamomaxei8.v v4, (a1), v8, v4
+       vamomaxei8.v x0, (a1), v8, v4
+       vamomaxei8.v v4, (a1), v8, v4, v0.t
+       vamomaxei8.v x0, (a1), v8, v4, v0.t
+       vamominuei8.v v4, (a1), v8, v4
+       vamominuei8.v x0, (a1), v8, v4
+       vamominuei8.v v4, (a1), v8, v4, v0.t
+       vamominuei8.v x0, (a1), v8, v4, v0.t
+       vamomaxuei8.v v4, (a1), v8, v4
+       vamomaxuei8.v x0, (a1), v8, v4
+       vamomaxuei8.v v4, (a1), v8, v4, v0.t
+       vamomaxuei8.v x0, (a1), v8, v4, v0.t
+
+       vamoaddei8.v v4, 0(a1), v8, v4
+       vamoaddei8.v x0, 0(a1), v8, v4
+       vamoaddei8.v v4, 0(a1), v8, v4, v0.t
+       vamoaddei8.v x0, 0(a1), v8, v4, v0.t
+       vamoswapei8.v v4, 0(a1), v8, v4
+       vamoswapei8.v x0, 0(a1), v8, v4
+       vamoswapei8.v v4, 0(a1), v8, v4, v0.t
+       vamoswapei8.v x0, 0(a1), v8, v4, v0.t
+
+       vamoxorei8.v v4, 0(a1), v8, v4
+       vamoxorei8.v x0, 0(a1), v8, v4
+       vamoxorei8.v v4, 0(a1), v8, v4, v0.t
+       vamoxorei8.v x0, 0(a1), v8, v4, v0.t
+       vamoandei8.v v4, 0(a1), v8, v4
+       vamoandei8.v x0, 0(a1), v8, v4
+       vamoandei8.v v4, 0(a1), v8, v4, v0.t
+       vamoandei8.v x0, 0(a1), v8, v4, v0.t
+       vamoorei8.v v4, 0(a1), v8, v4
+       vamoorei8.v x0, 0(a1), v8, v4
+       vamoorei8.v v4, 0(a1), v8, v4, v0.t
+       vamoorei8.v x0, 0(a1), v8, v4, v0.t
+
+       vamominei8.v v4, 0(a1), v8, v4
+       vamominei8.v x0, 0(a1), v8, v4
+       vamominei8.v v4, 0(a1), v8, v4, v0.t
+       vamominei8.v x0, 0(a1), v8, v4, v0.t
+       vamomaxei8.v v4, 0(a1), v8, v4
+       vamomaxei8.v x0, 0(a1), v8, v4
+       vamomaxei8.v v4, 0(a1), v8, v4, v0.t
+       vamomaxei8.v x0, 0(a1), v8, v4, v0.t
+       vamominuei8.v v4, 0(a1), v8, v4
+       vamominuei8.v x0, 0(a1), v8, v4
+       vamominuei8.v v4, 0(a1), v8, v4, v0.t
+       vamominuei8.v x0, 0(a1), v8, v4, v0.t
+       vamomaxuei8.v v4, 0(a1), v8, v4
+       vamomaxuei8.v x0, 0(a1), v8, v4
+       vamomaxuei8.v v4, 0(a1), v8, v4, v0.t
+       vamomaxuei8.v x0, 0(a1), v8, v4, v0.t
+
+       vamoaddei16.v v4, (a1), v8, v4
+       vamoaddei16.v x0, (a1), v8, v4
+       vamoaddei16.v v4, (a1), v8, v4, v0.t
+       vamoaddei16.v x0, (a1), v8, v4, v0.t
+       vamoswapei16.v v4, (a1), v8, v4
+       vamoswapei16.v x0, (a1), v8, v4
+       vamoswapei16.v v4, (a1), v8, v4, v0.t
+       vamoswapei16.v x0, (a1), v8, v4, v0.t
+
+       vamoxorei16.v v4, (a1), v8, v4
+       vamoxorei16.v x0, (a1), v8, v4
+       vamoxorei16.v v4, (a1), v8, v4, v0.t
+       vamoxorei16.v x0, (a1), v8, v4, v0.t
+       vamoandei16.v v4, (a1), v8, v4
+       vamoandei16.v x0, (a1), v8, v4
+       vamoandei16.v v4, (a1), v8, v4, v0.t
+       vamoandei16.v x0, (a1), v8, v4, v0.t
+       vamoorei16.v v4, (a1), v8, v4
+       vamoorei16.v x0, (a1), v8, v4
+       vamoorei16.v v4, (a1), v8, v4, v0.t
+       vamoorei16.v x0, (a1), v8, v4, v0.t
+
+       vamominei16.v v4, (a1), v8, v4
+       vamominei16.v x0, (a1), v8, v4
+       vamominei16.v v4, (a1), v8, v4, v0.t
+       vamominei16.v x0, (a1), v8, v4, v0.t
+       vamomaxei16.v v4, (a1), v8, v4
+       vamomaxei16.v x0, (a1), v8, v4
+       vamomaxei16.v v4, (a1), v8, v4, v0.t
+       vamomaxei16.v x0, (a1), v8, v4, v0.t
+       vamominuei16.v v4, (a1), v8, v4
+       vamominuei16.v x0, (a1), v8, v4
+       vamominuei16.v v4, (a1), v8, v4, v0.t
+       vamominuei16.v x0, (a1), v8, v4, v0.t
+       vamomaxuei16.v v4, (a1), v8, v4
+       vamomaxuei16.v x0, (a1), v8, v4
+       vamomaxuei16.v v4, (a1), v8, v4, v0.t
+       vamomaxuei16.v x0, (a1), v8, v4, v0.t
+
+       vamoaddei16.v v4, 0(a1), v8, v4
+       vamoaddei16.v x0, 0(a1), v8, v4
+       vamoaddei16.v v4, 0(a1), v8, v4, v0.t
+       vamoaddei16.v x0, 0(a1), v8, v4, v0.t
+       vamoswapei16.v v4, 0(a1), v8, v4
+       vamoswapei16.v x0, 0(a1), v8, v4
+       vamoswapei16.v v4, 0(a1), v8, v4, v0.t
+       vamoswapei16.v x0, 0(a1), v8, v4, v0.t
+
+       vamoxorei16.v v4, 0(a1), v8, v4
+       vamoxorei16.v x0, 0(a1), v8, v4
+       vamoxorei16.v v4, 0(a1), v8, v4, v0.t
+       vamoxorei16.v x0, 0(a1), v8, v4, v0.t
+       vamoandei16.v v4, 0(a1), v8, v4
+       vamoandei16.v x0, 0(a1), v8, v4
+       vamoandei16.v v4, 0(a1), v8, v4, v0.t
+       vamoandei16.v x0, 0(a1), v8, v4, v0.t
+       vamoorei16.v v4, 0(a1), v8, v4
+       vamoorei16.v x0, 0(a1), v8, v4
+       vamoorei16.v v4, 0(a1), v8, v4, v0.t
+       vamoorei16.v x0, 0(a1), v8, v4, v0.t
+
+       vamominei16.v v4, 0(a1), v8, v4
+       vamominei16.v x0, 0(a1), v8, v4
+       vamominei16.v v4, 0(a1), v8, v4, v0.t
+       vamominei16.v x0, 0(a1), v8, v4, v0.t
+       vamomaxei16.v v4, 0(a1), v8, v4
+       vamomaxei16.v x0, 0(a1), v8, v4
+       vamomaxei16.v v4, 0(a1), v8, v4, v0.t
+       vamomaxei16.v x0, 0(a1), v8, v4, v0.t
+       vamominuei16.v v4, 0(a1), v8, v4
+       vamominuei16.v x0, 0(a1), v8, v4
+       vamominuei16.v v4, 0(a1), v8, v4, v0.t
+       vamominuei16.v x0, 0(a1), v8, v4, v0.t
+       vamomaxuei16.v v4, 0(a1), v8, v4
+       vamomaxuei16.v x0, 0(a1), v8, v4
+       vamomaxuei16.v v4, 0(a1), v8, v4, v0.t
+       vamomaxuei16.v x0, 0(a1), v8, v4, v0.t
+
+       vamoaddei32.v v4, (a1), v8, v4
+       vamoaddei32.v x0, (a1), v8, v4
+       vamoaddei32.v v4, (a1), v8, v4, v0.t
+       vamoaddei32.v x0, (a1), v8, v4, v0.t
+       vamoswapei32.v v4, (a1), v8, v4
+       vamoswapei32.v x0, (a1), v8, v4
+       vamoswapei32.v v4, (a1), v8, v4, v0.t
+       vamoswapei32.v x0, (a1), v8, v4, v0.t
+
+       vamoxorei32.v v4, (a1), v8, v4
+       vamoxorei32.v x0, (a1), v8, v4
+       vamoxorei32.v v4, (a1), v8, v4, v0.t
+       vamoxorei32.v x0, (a1), v8, v4, v0.t
+       vamoandei32.v v4, (a1), v8, v4
+       vamoandei32.v x0, (a1), v8, v4
+       vamoandei32.v v4, (a1), v8, v4, v0.t
+       vamoandei32.v x0, (a1), v8, v4, v0.t
+       vamoorei32.v v4, (a1), v8, v4
+       vamoorei32.v x0, (a1), v8, v4
+       vamoorei32.v v4, (a1), v8, v4, v0.t
+       vamoorei32.v x0, (a1), v8, v4, v0.t
+
+       vamominei32.v v4, (a1), v8, v4
+       vamominei32.v x0, (a1), v8, v4
+       vamominei32.v v4, (a1), v8, v4, v0.t
+       vamominei32.v x0, (a1), v8, v4, v0.t
+       vamomaxei32.v v4, (a1), v8, v4
+       vamomaxei32.v x0, (a1), v8, v4
+       vamomaxei32.v v4, (a1), v8, v4, v0.t
+       vamomaxei32.v x0, (a1), v8, v4, v0.t
+       vamominuei32.v v4, (a1), v8, v4
+       vamominuei32.v x0, (a1), v8, v4
+       vamominuei32.v v4, (a1), v8, v4, v0.t
+       vamominuei32.v x0, (a1), v8, v4, v0.t
+       vamomaxuei32.v v4, (a1), v8, v4
+       vamomaxuei32.v x0, (a1), v8, v4
+       vamomaxuei32.v v4, (a1), v8, v4, v0.t
+       vamomaxuei32.v x0, (a1), v8, v4, v0.t
+
+       vamoaddei32.v v4, 0(a1), v8, v4
+       vamoaddei32.v x0, 0(a1), v8, v4
+       vamoaddei32.v v4, 0(a1), v8, v4, v0.t
+       vamoaddei32.v x0, 0(a1), v8, v4, v0.t
+       vamoswapei32.v v4, 0(a1), v8, v4
+       vamoswapei32.v x0, 0(a1), v8, v4
+       vamoswapei32.v v4, 0(a1), v8, v4, v0.t
+       vamoswapei32.v x0, 0(a1), v8, v4, v0.t
+
+       vamoxorei32.v v4, 0(a1), v8, v4
+       vamoxorei32.v x0, 0(a1), v8, v4
+       vamoxorei32.v v4, 0(a1), v8, v4, v0.t
+       vamoxorei32.v x0, 0(a1), v8, v4, v0.t
+       vamoandei32.v v4, 0(a1), v8, v4
+       vamoandei32.v x0, 0(a1), v8, v4
+       vamoandei32.v v4, 0(a1), v8, v4, v0.t
+       vamoandei32.v x0, 0(a1), v8, v4, v0.t
+       vamoorei32.v v4, 0(a1), v8, v4
+       vamoorei32.v x0, 0(a1), v8, v4
+       vamoorei32.v v4, 0(a1), v8, v4, v0.t
+       vamoorei32.v x0, 0(a1), v8, v4, v0.t
+
+       vamominei32.v v4, 0(a1), v8, v4
+       vamominei32.v x0, 0(a1), v8, v4
+       vamominei32.v v4, 0(a1), v8, v4, v0.t
+       vamominei32.v x0, 0(a1), v8, v4, v0.t
+       vamomaxei32.v v4, 0(a1), v8, v4
+       vamomaxei32.v x0, 0(a1), v8, v4
+       vamomaxei32.v v4, 0(a1), v8, v4, v0.t
+       vamomaxei32.v x0, 0(a1), v8, v4, v0.t
+       vamominuei32.v v4, 0(a1), v8, v4
+       vamominuei32.v x0, 0(a1), v8, v4
+       vamominuei32.v v4, 0(a1), v8, v4, v0.t
+       vamominuei32.v x0, 0(a1), v8, v4, v0.t
+       vamomaxuei32.v v4, 0(a1), v8, v4
+       vamomaxuei32.v x0, 0(a1), v8, v4
+       vamomaxuei32.v v4, 0(a1), v8, v4, v0.t
+       vamomaxuei32.v x0, 0(a1), v8, v4, v0.t
+
+       vamoaddei64.v v4, (a1), v8, v4
+       vamoaddei64.v x0, (a1), v8, v4
+       vamoaddei64.v v4, (a1), v8, v4, v0.t
+       vamoaddei64.v x0, (a1), v8, v4, v0.t
+       vamoswapei64.v v4, (a1), v8, v4
+       vamoswapei64.v x0, (a1), v8, v4
+       vamoswapei64.v v4, (a1), v8, v4, v0.t
+       vamoswapei64.v x0, (a1), v8, v4, v0.t
+
+       vamoxorei64.v v4, (a1), v8, v4
+       vamoxorei64.v x0, (a1), v8, v4
+       vamoxorei64.v v4, (a1), v8, v4, v0.t
+       vamoxorei64.v x0, (a1), v8, v4, v0.t
+       vamoandei64.v v4, (a1), v8, v4
+       vamoandei64.v x0, (a1), v8, v4
+       vamoandei64.v v4, (a1), v8, v4, v0.t
+       vamoandei64.v x0, (a1), v8, v4, v0.t
+       vamoorei64.v v4, (a1), v8, v4
+       vamoorei64.v x0, (a1), v8, v4
+       vamoorei64.v v4, (a1), v8, v4, v0.t
+       vamoorei64.v x0, (a1), v8, v4, v0.t
+
+       vamominei64.v v4, (a1), v8, v4
+       vamominei64.v x0, (a1), v8, v4
+       vamominei64.v v4, (a1), v8, v4, v0.t
+       vamominei64.v x0, (a1), v8, v4, v0.t
+       vamomaxei64.v v4, (a1), v8, v4
+       vamomaxei64.v x0, (a1), v8, v4
+       vamomaxei64.v v4, (a1), v8, v4, v0.t
+       vamomaxei64.v x0, (a1), v8, v4, v0.t
+       vamominuei64.v v4, (a1), v8, v4
+       vamominuei64.v x0, (a1), v8, v4
+       vamominuei64.v v4, (a1), v8, v4, v0.t
+       vamominuei64.v x0, (a1), v8, v4, v0.t
+       vamomaxuei64.v v4, (a1), v8, v4
+       vamomaxuei64.v x0, (a1), v8, v4
+       vamomaxuei64.v v4, (a1), v8, v4, v0.t
+       vamomaxuei64.v x0, (a1), v8, v4, v0.t
+
+       vamoaddei64.v v4, 0(a1), v8, v4
+       vamoaddei64.v x0, 0(a1), v8, v4
+       vamoaddei64.v v4, 0(a1), v8, v4, v0.t
+       vamoaddei64.v x0, 0(a1), v8, v4, v0.t
+       vamoswapei64.v v4, 0(a1), v8, v4
+       vamoswapei64.v x0, 0(a1), v8, v4
+       vamoswapei64.v v4, 0(a1), v8, v4, v0.t
+       vamoswapei64.v x0, 0(a1), v8, v4, v0.t
+
+       vamoxorei64.v v4, 0(a1), v8, v4
+       vamoxorei64.v x0, 0(a1), v8, v4
+       vamoxorei64.v v4, 0(a1), v8, v4, v0.t
+       vamoxorei64.v x0, 0(a1), v8, v4, v0.t
+       vamoandei64.v v4, 0(a1), v8, v4
+       vamoandei64.v x0, 0(a1), v8, v4
+       vamoandei64.v v4, 0(a1), v8, v4, v0.t
+       vamoandei64.v x0, 0(a1), v8, v4, v0.t
+       vamoorei64.v v4, 0(a1), v8, v4
+       vamoorei64.v x0, 0(a1), v8, v4
+       vamoorei64.v v4, 0(a1), v8, v4, v0.t
+       vamoorei64.v x0, 0(a1), v8, v4, v0.t
+
+       vamominei64.v v4, 0(a1), v8, v4
+       vamominei64.v x0, 0(a1), v8, v4
+       vamominei64.v v4, 0(a1), v8, v4, v0.t
+       vamominei64.v x0, 0(a1), v8, v4, v0.t
+       vamomaxei64.v v4, 0(a1), v8, v4
+       vamomaxei64.v x0, 0(a1), v8, v4
+       vamomaxei64.v v4, 0(a1), v8, v4, v0.t
+       vamomaxei64.v x0, 0(a1), v8, v4, v0.t
+       vamominuei64.v v4, 0(a1), v8, v4
+       vamominuei64.v x0, 0(a1), v8, v4
+       vamominuei64.v v4, 0(a1), v8, v4, v0.t
+       vamominuei64.v x0, 0(a1), v8, v4, v0.t
+       vamomaxuei64.v v4, 0(a1), v8, v4
+       vamomaxuei64.v x0, 0(a1), v8, v4
+       vamomaxuei64.v v4, 0(a1), v8, v4, v0.t
+       vamomaxuei64.v x0, 0(a1), v8, v4, v0.t
+
+       vneg.v v4, v8
+       vneg.v v4, v8, v0.t
+
+       vadd.vv v4, v8, v12
+       vadd.vx v4, v8, a1
+       vadd.vi v4, v8, 15
+       vadd.vi v4, v8, -16
+       vadd.vv v4, v8, v12, v0.t
+       vadd.vx v4, v8, a1, v0.t
+       vadd.vi v4, v8, 15, v0.t
+       vadd.vi v4, v8, -16, v0.t
+       vsub.vv v4, v8, v12
+       vsub.vx v4, v8, a1
+       vrsub.vx v4, v8, a1
+       vrsub.vi v4, v8, 15
+       vrsub.vi v4, v8, -16
+       vsub.vv v4, v8, v12, v0.t
+       vsub.vx v4, v8, a1, v0.t
+       vrsub.vx v4, v8, a1, v0.t
+       vrsub.vi v4, v8, 15, v0.t
+       vrsub.vi v4, v8, -16, v0.t
+
+       # Aliases
+       vwcvt.x.x.v v4, v8
+       vwcvtu.x.x.v v4, v8
+       vwcvt.x.x.v v4, v8, v0.t
+       vwcvtu.x.x.v v4, v8, v0.t
+
+       vwaddu.vv v4, v8, v12
+       vwaddu.vx v4, v8, a1
+       vwaddu.vv v4, v8, v12, v0.t
+       vwaddu.vx v4, v8, a1, v0.t
+       vwsubu.vv v4, v8, v12
+       vwsubu.vx v4, v8, a1
+       vwsubu.vv v4, v8, v12, v0.t
+       vwsubu.vx v4, v8, a1, v0.t
+       vwadd.vv v4, v8, v12
+       vwadd.vx v4, v8, a1
+       vwadd.vv v4, v8, v12, v0.t
+       vwadd.vx v4, v8, a1, v0.t
+       vwsub.vv v4, v8, v12
+       vwsub.vx v4, v8, a1
+       vwsub.vv v4, v8, v12, v0.t
+       vwsub.vx v4, v8, a1, v0.t
+       vwaddu.wv v4, v8, v12
+       vwaddu.wx v4, v8, a1
+       vwaddu.wv v4, v8, v12, v0.t
+       vwaddu.wx v4, v8, a1, v0.t
+       vwsubu.wv v4, v8, v12
+       vwsubu.wx v4, v8, a1
+       vwsubu.wv v4, v8, v12, v0.t
+       vwsubu.wx v4, v8, a1, v0.t
+       vwadd.wv v4, v8, v12
+       vwadd.wx v4, v8, a1
+       vwadd.wv v4, v8, v12, v0.t
+       vwadd.wx v4, v8, a1, v0.t
+       vwsub.wv v4, v8, v12
+       vwsub.wx v4, v8, a1
+       vwsub.wv v4, v8, v12, v0.t
+       vwsub.wx v4, v8, a1, v0.t
+
+       vzext.vf2 v4, v8
+       vzext.vf2 v4, v8, v0.t
+       vsext.vf2 v4, v8
+       vsext.vf2 v4, v8, v0.t
+       vzext.vf4 v4, v8
+       vzext.vf4 v4, v8, v0.t
+       vsext.vf4 v4, v8
+       vsext.vf4 v4, v8, v0.t
+       vzext.vf8 v4, v8
+       vzext.vf8 v4, v8, v0.t
+       vsext.vf8 v4, v8
+       vsext.vf8 v4, v8, v0.t
+
+       vadc.vvm v4, v8, v12, v0
+       vadc.vxm v4, v8, a1, v0
+       vadc.vim v4, v8, 15, v0
+       vadc.vim v4, v8, -16, v0
+       vmadc.vvm v4, v8, v12, v0
+       vmadc.vxm v4, v8, a1, v0
+       vmadc.vim v4, v8, 15, v0
+       vmadc.vim v4, v8, -16, v0
+       vmadc.vv v4, v8, v12
+       vmadc.vx v4, v8, a1
+       vmadc.vi v4, v8, 15
+       vmadc.vi v4, v8, -16
+       vsbc.vvm v4, v8, v12, v0
+       vsbc.vxm v4, v8, a1, v0
+       vmsbc.vvm v4, v8, v12, v0
+       vmsbc.vxm v4, v8, a1, v0
+       vmsbc.vv v4, v8, v12
+       vmsbc.vx v4, v8, a1
+
+       # Aliases
+       vnot.v v4, v8
+       vnot.v v4, v8, v0.t
+
+       vand.vv v4, v8, v12
+       vand.vx v4, v8, a1
+       vand.vi v4, v8, 15
+       vand.vi v4, v8, -16
+       vand.vv v4, v8, v12, v0.t
+       vand.vx v4, v8, a1, v0.t
+       vand.vi v4, v8, 15, v0.t
+       vand.vi v4, v8, -16, v0.t
+       vor.vv v4, v8, v12
+       vor.vx v4, v8, a1
+       vor.vi v4, v8, 15
+       vor.vi v4, v8, -16
+       vor.vv v4, v8, v12, v0.t
+       vor.vx v4, v8, a1, v0.t
+       vor.vi v4, v8, 15, v0.t
+       vor.vi v4, v8, -16, v0.t
+       vxor.vv v4, v8, v12
+       vxor.vx v4, v8, a1
+       vxor.vi v4, v8, 15
+       vxor.vi v4, v8, -16
+       vxor.vv v4, v8, v12, v0.t
+       vxor.vx v4, v8, a1, v0.t
+       vxor.vi v4, v8, 15, v0.t
+       vxor.vi v4, v8, -16, v0.t
+
+       vsll.vv v4, v8, v12
+       vsll.vx v4, v8, a1
+       vsll.vi v4, v8, 1
+       vsll.vi v4, v8, 31
+       vsll.vv v4, v8, v12, v0.t
+       vsll.vx v4, v8, a1, v0.t
+       vsll.vi v4, v8, 1, v0.t
+       vsll.vi v4, v8, 31, v0.t
+       vsrl.vv v4, v8, v12
+       vsrl.vx v4, v8, a1
+       vsrl.vi v4, v8, 1
+       vsrl.vi v4, v8, 31
+       vsrl.vv v4, v8, v12, v0.t
+       vsrl.vx v4, v8, a1, v0.t
+       vsrl.vi v4, v8, 1, v0.t
+       vsrl.vi v4, v8, 31, v0.t
+       vsra.vv v4, v8, v12
+       vsra.vx v4, v8, a1
+       vsra.vi v4, v8, 1
+       vsra.vi v4, v8, 31
+       vsra.vv v4, v8, v12, v0.t
+       vsra.vx v4, v8, a1, v0.t
+       vsra.vi v4, v8, 1, v0.t
+       vsra.vi v4, v8, 31, v0.t
+
+       # Aliases
+       vncvt.x.x.w v4, v8
+       vncvt.x.x.w v4, v8, v0.t
+
+       vnsrl.wv v4, v8, v12
+       vnsrl.wx v4, v8, a1
+       vnsrl.wi v4, v8, 1
+       vnsrl.wi v4, v8, 31
+       vnsrl.wv v4, v8, v12, v0.t
+       vnsrl.wx v4, v8, a1, v0.t
+       vnsrl.wi v4, v8, 1, v0.t
+       vnsrl.wi v4, v8, 31, v0.t
+       vnsra.wv v4, v8, v12
+       vnsra.wx v4, v8, a1
+       vnsra.wi v4, v8, 1
+       vnsra.wi v4, v8, 31
+       vnsra.wv v4, v8, v12, v0.t
+       vnsra.wx v4, v8, a1, v0.t
+       vnsra.wi v4, v8, 1, v0.t
+       vnsra.wi v4, v8, 31, v0.t
+
+       # Aliases
+       vmsgt.vv v4, v8, v12
+       vmsgtu.vv v4, v8, v12
+       vmsge.vv v4, v8, v12
+       vmsgeu.vv v4, v8, v12
+       vmsgt.vv v4, v8, v12, v0.t
+       vmsgtu.vv v4, v8, v12, v0.t
+       vmsge.vv v4, v8, v12, v0.t
+       vmsgeu.vv v4, v8, v12, v0.t
+       vmslt.vi v4, v8, 16
+       vmslt.vi v4, v8, -15
+       vmsltu.vi v4, v8, 16
+       vmsltu.vi v4, v8, -15
+       vmsge.vi v4, v8, 16
+       vmsge.vi v4, v8, -15
+       vmsgeu.vi v4, v8, 16
+       vmsgeu.vi v4, v8, -15
+       vmslt.vi v4, v8, 16, v0.t
+       vmslt.vi v4, v8, -15, v0.t
+       vmsltu.vi v4, v8, 16, v0.t
+       vmsltu.vi v4, v8, -15, v0.t
+       vmsge.vi v4, v8, 16, v0.t
+       vmsge.vi v4, v8, -15, v0.t
+       vmsgeu.vi v4, v8, 16, v0.t
+       vmsgeu.vi v4, v8, -15, v0.t
+
+       vmseq.vv v4, v8, v12
+       vmseq.vx v4, v8, a1
+       vmseq.vi v4, v8, 15
+       vmseq.vi v4, v8, -16
+       vmseq.vv v4, v8, v12, v0.t
+       vmseq.vx v4, v8, a1, v0.t
+       vmseq.vi v4, v8, 15, v0.t
+       vmseq.vi v4, v8, -16, v0.t
+       vmsne.vv v4, v8, v12
+       vmsne.vx v4, v8, a1
+       vmsne.vi v4, v8, 15
+       vmsne.vi v4, v8, -16
+       vmsne.vv v4, v8, v12, v0.t
+       vmsne.vx v4, v8, a1, v0.t
+       vmsne.vi v4, v8, 15, v0.t
+       vmsne.vi v4, v8, -16, v0.t
+       vmsltu.vv v4, v8, v12
+       vmsltu.vx v4, v8, a1
+       vmsltu.vv v4, v8, v12, v0.t
+       vmsltu.vx v4, v8, a1, v0.t
+       vmslt.vv v4, v8, v12
+       vmslt.vx v4, v8, a1
+       vmslt.vv v4, v8, v12, v0.t
+       vmslt.vx v4, v8, a1, v0.t
+       vmsleu.vv v4, v8, v12
+       vmsleu.vx v4, v8, a1
+       vmsleu.vi v4, v8, 15
+       vmsleu.vi v4, v8, -16
+       vmsleu.vv v4, v8, v12, v0.t
+       vmsleu.vx v4, v8, a1, v0.t
+       vmsleu.vi v4, v8, 15, v0.t
+       vmsleu.vi v4, v8, -16, v0.t
+       vmsle.vv v4, v8, v12
+       vmsle.vx v4, v8, a1
+       vmsle.vi v4, v8, 15
+       vmsle.vi v4, v8, -16
+       vmsle.vv v4, v8, v12, v0.t
+       vmsle.vx v4, v8, a1, v0.t
+       vmsle.vi v4, v8, 15, v0.t
+       vmsle.vi v4, v8, -16, v0.t
+       vmsgtu.vx v4, v8, a1
+       vmsgtu.vi v4, v8, 15
+       vmsgtu.vi v4, v8, -16
+       vmsgtu.vx v4, v8, a1, v0.t
+       vmsgtu.vi v4, v8, 15, v0.t
+       vmsgtu.vi v4, v8, -16, v0.t
+       vmsgt.vx v4, v8, a1
+       vmsgt.vi v4, v8, 15
+       vmsgt.vi v4, v8, -16
+       vmsgt.vx v4, v8, a1, v0.t
+       vmsgt.vi v4, v8, 15, v0.t
+       vmsgt.vi v4, v8, -16, v0.t
+
+       vminu.vv v4, v8, v12
+       vminu.vx v4, v8, a1
+       vminu.vv v4, v8, v12, v0.t
+       vminu.vx v4, v8, a1, v0.t
+       vmin.vv v4, v8, v12
+       vmin.vx v4, v8, a1
+       vmin.vv v4, v8, v12, v0.t
+       vmin.vx v4, v8, a1, v0.t
+       vmaxu.vv v4, v8, v12
+       vmaxu.vx v4, v8, a1
+       vmaxu.vv v4, v8, v12, v0.t
+       vmaxu.vx v4, v8, a1, v0.t
+       vmax.vv v4, v8, v12
+       vmax.vx v4, v8, a1
+       vmax.vv v4, v8, v12, v0.t
+       vmax.vx v4, v8, a1, v0.t
+
+       vmul.vv v4, v8, v12
+       vmul.vx v4, v8, a1
+       vmul.vv v4, v8, v12, v0.t
+       vmul.vx v4, v8, a1, v0.t
+       vmulh.vv v4, v8, v12
+       vmulh.vx v4, v8, a1
+       vmulh.vv v4, v8, v12, v0.t
+       vmulh.vx v4, v8, a1, v0.t
+       vmulhu.vv v4, v8, v12
+       vmulhu.vx v4, v8, a1
+       vmulhu.vv v4, v8, v12, v0.t
+       vmulhu.vx v4, v8, a1, v0.t
+       vmulhsu.vv v4, v8, v12
+       vmulhsu.vx v4, v8, a1
+       vmulhsu.vv v4, v8, v12, v0.t
+       vmulhsu.vx v4, v8, a1, v0.t
+
+       vwmul.vv v4, v8, v12
+       vwmul.vx v4, v8, a1
+       vwmul.vv v4, v8, v12, v0.t
+       vwmul.vx v4, v8, a1, v0.t
+       vwmulu.vv v4, v8, v12
+       vwmulu.vx v4, v8, a1
+       vwmulu.vv v4, v8, v12, v0.t
+       vwmulu.vx v4, v8, a1, v0.t
+       vwmulsu.vv v4, v8, v12
+       vwmulsu.vx v4, v8, a1
+       vwmulsu.vv v4, v8, v12, v0.t
+       vwmulsu.vx v4, v8, a1, v0.t
+
+       vmacc.vv v4, v12, v8
+       vmacc.vx v4, a1, v8
+       vmacc.vv v4, v12, v8, v0.t
+       vmacc.vx v4, a1, v8, v0.t
+       vnmsac.vv v4, v12, v8
+       vnmsac.vx v4, a1, v8
+       vnmsac.vv v4, v12, v8, v0.t
+       vnmsac.vx v4, a1, v8, v0.t
+       vmadd.vv v4, v12, v8
+       vmadd.vx v4, a1, v8
+       vmadd.vv v4, v12, v8, v0.t
+       vmadd.vx v4, a1, v8, v0.t
+       vnmsub.vv v4, v12, v8
+       vnmsub.vx v4, a1, v8
+       vnmsub.vv v4, v12, v8, v0.t
+       vnmsub.vx v4, a1, v8, v0.t
+
+       vwmaccu.vv v4, v12, v8
+       vwmaccu.vx v4, a1, v8
+       vwmaccu.vv v4, v12, v8, v0.t
+       vwmaccu.vx v4, a1, v8, v0.t
+       vwmacc.vv v4, v12, v8
+       vwmacc.vx v4, a1, v8
+       vwmacc.vv v4, v12, v8, v0.t
+       vwmacc.vx v4, a1, v8, v0.t
+       vwmaccsu.vv v4, v12, v8
+       vwmaccsu.vx v4, a1, v8
+       vwmaccsu.vv v4, v12, v8, v0.t
+       vwmaccsu.vx v4, a1, v8, v0.t
+       vwmaccus.vx v4, a1, v8
+       vwmaccus.vx v4, a1, v8, v0.t
+
+       vdivu.vv v4, v8, v12
+       vdivu.vx v4, v8, a1
+       vdivu.vv v4, v8, v12, v0.t
+       vdivu.vx v4, v8, a1, v0.t
+       vdiv.vv v4, v8, v12
+       vdiv.vx v4, v8, a1
+       vdiv.vv v4, v8, v12, v0.t
+       vdiv.vx v4, v8, a1, v0.t
+       vremu.vv v4, v8, v12
+       vremu.vx v4, v8, a1
+       vremu.vv v4, v8, v12, v0.t
+       vremu.vx v4, v8, a1, v0.t
+       vrem.vv v4, v8, v12
+       vrem.vx v4, v8, a1
+       vrem.vv v4, v8, v12, v0.t
+       vrem.vx v4, v8, a1, v0.t
+
+       vmerge.vvm v4, v8, v12, v0
+       vmerge.vxm v4, v8, a1, v0
+       vmerge.vim v4, v8, 15, v0
+       vmerge.vim v4, v8, -16, v0
+
+       vmv.v.v v8, v12
+       vmv.v.x v8, a1
+       vmv.v.i v8, 15
+       vmv.v.i v8, -16
+
+       vsaddu.vv v4, v8, v12
+       vsaddu.vx v4, v8, a1
+       vsaddu.vi v4, v8, 15
+       vsaddu.vi v4, v8, -16
+       vsaddu.vv v4, v8, v12, v0.t
+       vsaddu.vx v4, v8, a1, v0.t
+       vsaddu.vi v4, v8, 15, v0.t
+       vsaddu.vi v4, v8, -16, v0.t
+       vsadd.vv v4, v8, v12
+       vsadd.vx v4, v8, a1
+       vsadd.vi v4, v8, 15
+       vsadd.vi v4, v8, -16
+       vsadd.vv v4, v8, v12, v0.t
+       vsadd.vx v4, v8, a1, v0.t
+       vsadd.vi v4, v8, 15, v0.t
+       vsadd.vi v4, v8, -16, v0.t
+       vssubu.vv v4, v8, v12
+       vssubu.vx v4, v8, a1
+       vssubu.vv v4, v8, v12, v0.t
+       vssubu.vx v4, v8, a1, v0.t
+       vssub.vv v4, v8, v12
+       vssub.vx v4, v8, a1
+       vssub.vv v4, v8, v12, v0.t
+       vssub.vx v4, v8, a1, v0.t
+
+       vaaddu.vv v4, v8, v12
+       vaaddu.vx v4, v8, a1
+       vaaddu.vv v4, v8, v12, v0.t
+       vaaddu.vx v4, v8, a1, v0.t
+       vaadd.vv v4, v8, v12
+       vaadd.vx v4, v8, a1
+       vaadd.vv v4, v8, v12, v0.t
+       vaadd.vx v4, v8, a1, v0.t
+       vasubu.vv v4, v8, v12
+       vasubu.vx v4, v8, a1
+       vasubu.vv v4, v8, v12, v0.t
+       vasubu.vx v4, v8, a1, v0.t
+       vasub.vv v4, v8, v12
+       vasub.vx v4, v8, a1
+       vasub.vv v4, v8, v12, v0.t
+       vasub.vx v4, v8, a1, v0.t
+
+       vsmul.vv v4, v8, v12
+       vsmul.vx v4, v8, a1
+       vsmul.vv v4, v8, v12, v0.t
+       vsmul.vx v4, v8, a1, v0.t
+
+       vssrl.vv v4, v8, v12
+       vssrl.vx v4, v8, a1
+       vssrl.vi v4, v8, 1
+       vssrl.vi v4, v8, 31
+       vssrl.vv v4, v8, v12, v0.t
+       vssrl.vx v4, v8, a1, v0.t
+       vssrl.vi v4, v8, 1, v0.t
+       vssrl.vi v4, v8, 31, v0.t
+       vssra.vv v4, v8, v12
+       vssra.vx v4, v8, a1
+       vssra.vi v4, v8, 1
+       vssra.vi v4, v8, 31
+       vssra.vv v4, v8, v12, v0.t
+       vssra.vx v4, v8, a1, v0.t
+       vssra.vi v4, v8, 1, v0.t
+       vssra.vi v4, v8, 31, v0.t
+
+       vnclipu.wv v4, v8, v12
+       vnclipu.wx v4, v8, a1
+       vnclipu.wi v4, v8, 1
+       vnclipu.wi v4, v8, 31
+       vnclipu.wv v4, v8, v12, v0.t
+       vnclipu.wx v4, v8, a1, v0.t
+       vnclipu.wi v4, v8, 1, v0.t
+       vnclipu.wi v4, v8, 31, v0.t
+       vnclip.wv v4, v8, v12
+       vnclip.wx v4, v8, a1
+       vnclip.wi v4, v8, 1
+       vnclip.wi v4, v8, 31
+       vnclip.wv v4, v8, v12, v0.t
+       vnclip.wx v4, v8, a1, v0.t
+       vnclip.wi v4, v8, 1, v0.t
+       vnclip.wi v4, v8, 31, v0.t
+
+       vfadd.vv v4, v8, v12
+       vfadd.vf v4, v8, fa2
+       vfadd.vv v4, v8, v12, v0.t
+       vfadd.vf v4, v8, fa2, v0.t
+       vfsub.vv v4, v8, v12
+       vfsub.vf v4, v8, fa2
+       vfsub.vv v4, v8, v12, v0.t
+       vfsub.vf v4, v8, fa2, v0.t
+       vfrsub.vf v4, v8, fa2
+       vfrsub.vf v4, v8, fa2, v0.t
+
+       vfwadd.vv v4, v8, v12
+       vfwadd.vf v4, v8, fa2
+       vfwadd.vv v4, v8, v12, v0.t
+       vfwadd.vf v4, v8, fa2, v0.t
+       vfwsub.vv v4, v8, v12
+       vfwsub.vf v4, v8, fa2
+       vfwsub.vv v4, v8, v12, v0.t
+       vfwsub.vf v4, v8, fa2, v0.t
+       vfwadd.wv v4, v8, v12
+       vfwadd.wf v4, v8, fa2
+       vfwadd.wv v4, v8, v12, v0.t
+       vfwadd.wf v4, v8, fa2, v0.t
+       vfwsub.wv v4, v8, v12
+       vfwsub.wf v4, v8, fa2
+       vfwsub.wv v4, v8, v12, v0.t
+       vfwsub.wf v4, v8, fa2, v0.t
+
+       vfmul.vv v4, v8, v12
+       vfmul.vf v4, v8, fa2
+       vfmul.vv v4, v8, v12, v0.t
+       vfmul.vf v4, v8, fa2, v0.t
+       vfdiv.vv v4, v8, v12
+       vfdiv.vf v4, v8, fa2
+       vfdiv.vv v4, v8, v12, v0.t
+       vfdiv.vf v4, v8, fa2, v0.t
+       vfrdiv.vf v4, v8, fa2
+       vfrdiv.vf v4, v8, fa2, v0.t
+
+       vfwmul.vv v4, v8, v12
+       vfwmul.vf v4, v8, fa2
+       vfwmul.vv v4, v8, v12, v0.t
+       vfwmul.vf v4, v8, fa2, v0.t
+
+       vfmadd.vv v4, v12, v8
+       vfmadd.vf v4, fa2, v8
+       vfnmadd.vv v4, v12, v8
+       vfnmadd.vf v4, fa2, v8
+       vfmsub.vv v4, v12, v8
+       vfmsub.vf v4, fa2, v8
+       vfnmsub.vv v4, v12, v8
+       vfnmsub.vf v4, fa2, v8
+       vfmadd.vv v4, v12, v8, v0.t
+       vfmadd.vf v4, fa2, v8, v0.t
+       vfnmadd.vv v4, v12, v8, v0.t
+       vfnmadd.vf v4, fa2, v8, v0.t
+       vfmsub.vv v4, v12, v8, v0.t
+       vfmsub.vf v4, fa2, v8, v0.t
+       vfnmsub.vv v4, v12, v8, v0.t
+       vfnmsub.vf v4, fa2, v8, v0.t
+       vfmacc.vv v4, v12, v8
+       vfmacc.vf v4, fa2, v8
+       vfnmacc.vv v4, v12, v8
+       vfnmacc.vf v4, fa2, v8
+       vfmsac.vv v4, v12, v8
+       vfmsac.vf v4, fa2, v8
+       vfnmsac.vv v4, v12, v8
+       vfnmsac.vf v4, fa2, v8
+       vfmacc.vv v4, v12, v8, v0.t
+       vfmacc.vf v4, fa2, v8, v0.t
+       vfnmacc.vv v4, v12, v8, v0.t
+       vfnmacc.vf v4, fa2, v8, v0.t
+       vfmsac.vv v4, v12, v8, v0.t
+       vfmsac.vf v4, fa2, v8, v0.t
+       vfnmsac.vv v4, v12, v8, v0.t
+       vfnmsac.vf v4, fa2, v8, v0.t
+
+       vfwmacc.vv v4, v12, v8
+       vfwmacc.vf v4, fa2, v8
+       vfwnmacc.vv v4, v12, v8
+       vfwnmacc.vf v4, fa2, v8
+       vfwmsac.vv v4, v12, v8
+       vfwmsac.vf v4, fa2, v8
+       vfwnmsac.vv v4, v12, v8
+       vfwnmsac.vf v4, fa2, v8
+       vfwmacc.vv v4, v12, v8, v0.t
+       vfwmacc.vf v4, fa2, v8, v0.t
+       vfwnmacc.vv v4, v12, v8, v0.t
+       vfwnmacc.vf v4, fa2, v8, v0.t
+       vfwmsac.vv v4, v12, v8, v0.t
+       vfwmsac.vf v4, fa2, v8, v0.t
+       vfwnmsac.vv v4, v12, v8, v0.t
+       vfwnmsac.vf v4, fa2, v8, v0.t
+
+       vfsqrt.v v4, v8
+       vfsqrt.v v4, v8, v0.t
+       vfrsqrte7.v v4, v8
+       vfrsqrte7.v v4, v8, v0.t
+       vfrsqrt7.v v4, v8
+       vfrsqrt7.v v4, v8, v0.t
+       vfrece7.v v4, v8
+       vfrece7.v v4, v8, v0.t
+       vfrec7.v v4, v8
+       vfrec7.v v4, v8, v0.t
+       vfclass.v v4, v8
+       vfclass.v v4, v8, v0.t
+
+       vfmin.vv v4, v8, v12
+       vfmin.vf v4, v8, fa2
+       vfmax.vv v4, v8, v12
+       vfmax.vf v4, v8, fa2
+       vfmin.vv v4, v8, v12, v0.t
+       vfmin.vf v4, v8, fa2, v0.t
+       vfmax.vv v4, v8, v12, v0.t
+       vfmax.vf v4, v8, fa2, v0.t
+
+       vfneg.v v4, v8
+       vfneg.v v4, v8, v0.t
+
+       vfsgnj.vv v4, v8, v12
+       vfsgnj.vf v4, v8, fa2
+       vfsgnjn.vv v4, v8, v12
+       vfsgnjn.vf v4, v8, fa2
+       vfsgnjx.vv v4, v8, v12
+       vfsgnjx.vf v4, v8, fa2
+       vfsgnj.vv v4, v8, v12, v0.t
+       vfsgnj.vf v4, v8, fa2, v0.t
+       vfsgnjn.vv v4, v8, v12, v0.t
+       vfsgnjn.vf v4, v8, fa2, v0.t
+       vfsgnjx.vv v4, v8, v12, v0.t
+       vfsgnjx.vf v4, v8, fa2, v0.t
+
+       # Aliases
+       vmfgt.vv v4, v8, v12
+       vmfge.vv v4, v8, v12
+       vmfgt.vv v4, v8, v12, v0.t
+       vmfge.vv v4, v8, v12, v0.t
+
+       vmfeq.vv v4, v8, v12
+       vmfeq.vf v4, v8, fa2
+       vmfne.vv v4, v8, v12
+       vmfne.vf v4, v8, fa2
+       vmflt.vv v4, v8, v12
+       vmflt.vf v4, v8, fa2
+       vmfle.vv v4, v8, v12
+       vmfle.vf v4, v8, fa2
+       vmfgt.vf v4, v8, fa2
+       vmfge.vf v4, v8, fa2
+       vmfeq.vv v4, v8, v12, v0.t
+       vmfeq.vf v4, v8, fa2, v0.t
+       vmfne.vv v4, v8, v12, v0.t
+       vmfne.vf v4, v8, fa2, v0.t
+       vmflt.vv v4, v8, v12, v0.t
+       vmflt.vf v4, v8, fa2, v0.t
+       vmfle.vv v4, v8, v12, v0.t
+       vmfle.vf v4, v8, fa2, v0.t
+       vmfgt.vf v4, v8, fa2, v0.t
+       vmfge.vf v4, v8, fa2, v0.t
+
+       vfmerge.vfm v4, v8, fa2, v0
+       vfmv.v.f v4, fa1
+
+       vfcvt.xu.f.v v4, v8
+       vfcvt.x.f.v v4, v8
+       vfcvt.rtz.xu.f.v v4, v8
+       vfcvt.rtz.x.f.v v4, v8
+       vfcvt.f.xu.v v4, v8
+       vfcvt.f.x.v v4, v8
+       vfcvt.xu.f.v v4, v8, v0.t
+       vfcvt.x.f.v v4, v8, v0.t
+       vfcvt.rtz.xu.f.v v4, v8, v0.t
+       vfcvt.rtz.x.f.v v4, v8, v0.t
+       vfcvt.f.xu.v v4, v8, v0.t
+       vfcvt.f.x.v v4, v8, v0.t
+
+       vfwcvt.xu.f.v v4, v8
+       vfwcvt.x.f.v v4, v8
+       vfwcvt.rtz.xu.f.v v4, v8
+       vfwcvt.rtz.x.f.v v4, v8
+       vfwcvt.f.xu.v v4, v8
+       vfwcvt.f.x.v v4, v8
+       vfwcvt.f.f.v v4, v8
+       vfwcvt.xu.f.v v4, v8, v0.t
+       vfwcvt.x.f.v v4, v8, v0.t
+       vfwcvt.rtz.xu.f.v v4, v8, v0.t
+       vfwcvt.rtz.x.f.v v4, v8, v0.t
+       vfwcvt.f.xu.v v4, v8, v0.t
+       vfwcvt.f.x.v v4, v8, v0.t
+       vfwcvt.f.f.v v4, v8, v0.t
+
+       vfncvt.xu.f.w v4, v8
+       vfncvt.x.f.w v4, v8
+       vfncvt.rtz.xu.f.w v4, v8
+       vfncvt.rtz.x.f.w v4, v8
+       vfncvt.f.xu.w v4, v8
+       vfncvt.f.x.w v4, v8
+       vfncvt.f.f.w v4, v8
+       vfncvt.rod.f.f.w v4, v8
+       vfncvt.xu.f.w v4, v8, v0.t
+       vfncvt.x.f.w v4, v8, v0.t
+       vfncvt.rtz.xu.f.w v4, v8, v0.t
+       vfncvt.rtz.x.f.w v4, v8, v0.t
+       vfncvt.f.xu.w v4, v8, v0.t
+       vfncvt.f.x.w v4, v8, v0.t
+       vfncvt.f.f.w v4, v8, v0.t
+       vfncvt.rod.f.f.w v4, v8, v0.t
+
+       vredsum.vs v4, v8, v12
+       vredmaxu.vs v4, v8, v8
+       vredmax.vs v4, v8, v8
+       vredminu.vs v4, v8, v8
+       vredmin.vs v4, v8, v8
+       vredand.vs v4, v8, v12
+       vredor.vs v4, v8, v12
+       vredxor.vs v4, v8, v12
+       vredsum.vs v4, v8, v12, v0.t
+       vredmaxu.vs v4, v8, v8, v0.t
+       vredmax.vs v4, v8, v8, v0.t
+       vredminu.vs v4, v8, v8, v0.t
+       vredmin.vs v4, v8, v8, v0.t
+       vredand.vs v4, v8, v12, v0.t
+       vredor.vs v4, v8, v12, v0.t
+       vredxor.vs v4, v8, v12, v0.t
+
+       vwredsumu.vs v4, v8, v12
+       vwredsum.vs v4, v8, v12
+       vwredsumu.vs v4, v8, v12, v0.t
+       vwredsum.vs v4, v8, v12, v0.t
+
+       vfredosum.vs v4, v8, v12
+       vfredsum.vs v4, v8, v12
+       vfredmax.vs v4, v8, v12
+       vfredmin.vs v4, v8, v12
+       vfredosum.vs v4, v8, v12, v0.t
+       vfredsum.vs v4, v8, v12, v0.t
+       vfredmax.vs v4, v8, v12, v0.t
+       vfredmin.vs v4, v8, v12, v0.t
+
+       vfwredosum.vs v4, v8, v12
+       vfwredsum.vs v4, v8, v12
+       vfwredosum.vs v4, v8, v12, v0.t
+       vfwredsum.vs v4, v8, v12, v0.t
+
+       # Aliases
+       vmcpy.m v4, v8
+       vmmv.m v4, v8
+       vmclr.m v4
+       vmset.m v4
+       vmnot.m v4, v8
+
+       vmand.mm v4, v8, v12
+       vmnand.mm v4, v8, v12
+       vmandnot.mm v4, v8, v12
+       vmxor.mm v4, v8, v12
+       vmor.mm v4, v8, v12
+       vmnor.mm v4, v8, v12
+       vmornot.mm v4, v8, v12
+       vmxnor.mm v4, v8, v12
+
+       vpopc.m a0, v12
+       vfirst.m a0, v12
+       vmsbf.m v4, v8
+       vmsif.m v4, v8
+       vmsof.m v4, v8
+       viota.m v4, v8
+       vid.v v4
+       vpopc.m a0, v12, v0.t
+       vfirst.m a0, v12, v0.t
+       vmsbf.m v4, v8, v0.t
+       vmsif.m v4, v8, v0.t
+       vmsof.m v4, v8, v0.t
+       viota.m v4, v8, v0.t
+       vid.v v4, v0.t
+
+       vmv.x.s a0, v12
+       vmv.s.x v4, a0
+
+       vfmv.f.s fa0, v8
+       vfmv.s.f v4, fa1
+
+       vslideup.vx v4, v8, a1
+       vslideup.vi v4, v8, 0
+       vslideup.vi v4, v8, 31
+       vslidedown.vx v4, v8, a1
+       vslidedown.vi v4, v8, 0
+       vslidedown.vi v4, v8, 31
+       vslideup.vx v4, v8, a1, v0.t
+       vslideup.vi v4, v8, 0, v0.t
+       vslideup.vi v4, v8, 31, v0.t
+       vslidedown.vx v4, v8, a1, v0.t
+       vslidedown.vi v4, v8, 0, v0.t
+       vslidedown.vi v4, v8, 31, v0.t
+
+       vslide1up.vx v4, v8, a1
+       vslide1down.vx v4, v8, a1
+       vslide1up.vx v4, v8, a1, v0.t
+       vslide1down.vx v4, v8, a1, v0.t
+
+       vfslide1up.vf v4, v8, fa1
+       vfslide1down.vf v4, v8, fa1
+       vfslide1up.vf v4, v8, fa1, v0.t
+       vfslide1down.vf v4, v8, fa1, v0.t
+
+       vrgather.vv v4, v8, v12
+       vrgather.vx v4, v8, a1
+       vrgather.vi v4, v8, 0
+       vrgather.vi v4, v8, 31
+       vrgather.vv v4, v8, v12, v0.t
+       vrgather.vx v4, v8, a1, v0.t
+       vrgather.vi v4, v8, 0, v0.t
+       vrgather.vi v4, v8, 31, v0.t
+
+       vrgatherei16.vv v4, v8, v12
+       vrgatherei16.vv v4, v8, v12, v0.t
+
+       vcompress.vm v4, v8, v12
+
+       vmv1r.v v1, v2
+       vmv2r.v v2, v4
+       vmv4r.v v4, v8
+       vmv8r.v v0, v8
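For context on how the register-register arithmetic testcases above (e.g. `vmul.vv v4, v8, v12` and its `v0.t`-masked form) map to instruction words, here is a minimal sketch of the OP-V `.vv` field layout from the rvv v0.10 draft spec. The function name and parameter names are illustrative, not from the patch; the field positions (funct6 in bits 31:26, vm in bit 25, vs2/vs1/funct3/vd, major opcode 0x57) follow the draft spec, where vm=0 selects masked execution under `v0.t`.

```python
def encode_opv_vv(funct6, vm, vs2, vs1, funct3, vd, opcode=0x57):
    """Assemble one OP-V .vv instruction word (rvv v0.10 field layout)."""
    return ((funct6 << 26) | (vm << 25) | (vs2 << 20) | (vs1 << 15)
            | (funct3 << 12) | (vd << 7) | opcode)

# vmul.vv v4, v8, v12 (unmasked, vm=1): funct6=0b100101, OPMVV funct3=0b010
insn = encode_opv_vv(0b100101, 1, 8, 12, 0b010, 4)

# vmul.vv v4, v8, v12, v0.t (masked) differs only in the vm bit (vm=0).
insn_masked = encode_opv_vv(0b100101, 0, 8, 12, 0b010, 4)
```

Clearing a single bit is all that distinguishes the masked testcases from the unmasked ones, which is why the testsuite exercises both spellings of nearly every mnemonic.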
index c9c292aab5de57cb2e4303ac6d1b6b670581fe5b..cc7c56c74a543de4d53dc12d649815e2ef0b828f 100644 (file)
 
 #ifndef RISCV_EXTENDED_ENCODING_H
 #define RISCV_EXTENDED_ENCODING_H
+/* RVV instruction.  */
+#define MATCH_VSETVL           0x80007057
+#define MASK_VSETVL            0xfe00707f
+#define MATCH_VSETIVLI         0xc0007057
+#define MASK_VSETIVLI          0xc000707f
+#define MATCH_VSETVLI          0x00007057
+#define MASK_VSETVLI           0x8000707f
+#define MATCH_VLE1V            0x02b00007
+#define MASK_VLE1V             0xfff0707f
+#define MATCH_VSE1V            0x02b00027
+#define MASK_VSE1V             0xfff0707f
+#define MATCH_VLE8V            0x00000007
+#define MASK_VLE8V             0xfdf0707f
+#define MATCH_VLE16V           0x00005007
+#define MASK_VLE16V            0xfdf0707f
+#define MATCH_VLE32V           0x00006007
+#define MASK_VLE32V            0xfdf0707f
+#define MATCH_VLE64V           0x00007007
+#define MASK_VLE64V            0xfdf0707f
+#define MATCH_VSE8V            0x00000027
+#define MASK_VSE8V             0xfdf0707f
+#define MATCH_VSE16V           0x00005027
+#define MASK_VSE16V            0xfdf0707f
+#define MATCH_VSE32V           0x00006027
+#define MASK_VSE32V            0xfdf0707f
+#define MATCH_VSE64V           0x00007027
+#define MASK_VSE64V            0xfdf0707f
+#define MATCH_VLSE8V           0x08000007
+#define MASK_VLSE8V            0xfc00707f
+#define MATCH_VLSE16V          0x08005007
+#define MASK_VLSE16V           0xfc00707f
+#define MATCH_VLSE32V          0x08006007
+#define MASK_VLSE32V           0xfc00707f
+#define MATCH_VLSE64V          0x08007007
+#define MASK_VLSE64V           0xfc00707f
+#define MATCH_VSSE8V           0x08000027
+#define MASK_VSSE8V            0xfc00707f
+#define MATCH_VSSE16V          0x08005027
+#define MASK_VSSE16V           0xfc00707f
+#define MATCH_VSSE32V          0x08006027
+#define MASK_VSSE32V           0xfc00707f
+#define MATCH_VSSE64V          0x08007027
+#define MASK_VSSE64V           0xfc00707f
+#define MATCH_VLOXEI8V         0x0c000007
+#define MASK_VLOXEI8V          0xfc00707f
+#define MATCH_VLOXEI16V                0x0c005007
+#define MASK_VLOXEI16V         0xfc00707f
+#define MATCH_VLOXEI32V                0x0c006007
+#define MASK_VLOXEI32V         0xfc00707f
+#define MATCH_VLOXEI64V                0x0c007007
+#define MASK_VLOXEI64V         0xfc00707f
+#define MATCH_VSOXEI8V         0x0c000027
+#define MASK_VSOXEI8V          0xfc00707f
+#define MATCH_VSOXEI16V                0x0c005027
+#define MASK_VSOXEI16V         0xfc00707f
+#define MATCH_VSOXEI32V                0x0c006027
+#define MASK_VSOXEI32V         0xfc00707f
+#define MATCH_VSOXEI64V                0x0c007027
+#define MASK_VSOXEI64V         0xfc00707f
+#define MATCH_VLUXEI8V         0x04000007
+#define MASK_VLUXEI8V          0xfc00707f
+#define MATCH_VLUXEI16V                0x04005007
+#define MASK_VLUXEI16V         0xfc00707f
+#define MATCH_VLUXEI32V                0x04006007
+#define MASK_VLUXEI32V         0xfc00707f
+#define MATCH_VLUXEI64V                0x04007007
+#define MASK_VLUXEI64V         0xfc00707f
+#define MATCH_VSUXEI8V         0x04000027
+#define MASK_VSUXEI8V          0xfc00707f
+#define MATCH_VSUXEI16V                0x04005027
+#define MASK_VSUXEI16V         0xfc00707f
+#define MATCH_VSUXEI32V                0x04006027
+#define MASK_VSUXEI32V         0xfc00707f
+#define MATCH_VSUXEI64V                0x04007027
+#define MASK_VSUXEI64V         0xfc00707f
+#define MATCH_VLE8FFV          0x01000007
+#define MASK_VLE8FFV           0xfdf0707f
+#define MATCH_VLE16FFV         0x01005007
+#define MASK_VLE16FFV          0xfdf0707f
+#define MATCH_VLE32FFV         0x01006007
+#define MASK_VLE32FFV          0xfdf0707f
+#define MATCH_VLE64FFV         0x01007007
+#define MASK_VLE64FFV          0xfdf0707f
+#define MATCH_VLSEG2E8V                0x20000007
+#define MASK_VLSEG2E8V         0xfdf0707f
+#define MATCH_VSSEG2E8V                0x20000027
+#define MASK_VSSEG2E8V         0xfdf0707f
+#define MATCH_VLSEG3E8V                0x40000007
+#define MASK_VLSEG3E8V         0xfdf0707f
+#define MATCH_VSSEG3E8V                0x40000027
+#define MASK_VSSEG3E8V         0xfdf0707f
+#define MATCH_VLSEG4E8V                0x60000007
+#define MASK_VLSEG4E8V         0xfdf0707f
+#define MATCH_VSSEG4E8V                0x60000027
+#define MASK_VSSEG4E8V         0xfdf0707f
+#define MATCH_VLSEG5E8V                0x80000007
+#define MASK_VLSEG5E8V         0xfdf0707f
+#define MATCH_VSSEG5E8V                0x80000027
+#define MASK_VSSEG5E8V         0xfdf0707f
+#define MATCH_VLSEG6E8V                0xa0000007
+#define MASK_VLSEG6E8V         0xfdf0707f
+#define MATCH_VSSEG6E8V                0xa0000027
+#define MASK_VSSEG6E8V         0xfdf0707f
+#define MATCH_VLSEG7E8V                0xc0000007
+#define MASK_VLSEG7E8V         0xfdf0707f
+#define MATCH_VSSEG7E8V                0xc0000027
+#define MASK_VSSEG7E8V         0xfdf0707f
+#define MATCH_VLSEG8E8V                0xe0000007
+#define MASK_VLSEG8E8V         0xfdf0707f
+#define MATCH_VSSEG8E8V                0xe0000027
+#define MASK_VSSEG8E8V         0xfdf0707f
+#define MATCH_VLSEG2E16V       0x20005007
+#define MASK_VLSEG2E16V                0xfdf0707f
+#define MATCH_VSSEG2E16V       0x20005027
+#define MASK_VSSEG2E16V                0xfdf0707f
+#define MATCH_VLSEG3E16V       0x40005007
+#define MASK_VLSEG3E16V                0xfdf0707f
+#define MATCH_VSSEG3E16V       0x40005027
+#define MASK_VSSEG3E16V                0xfdf0707f
+#define MATCH_VLSEG4E16V       0x60005007
+#define MASK_VLSEG4E16V                0xfdf0707f
+#define MATCH_VSSEG4E16V       0x60005027
+#define MASK_VSSEG4E16V                0xfdf0707f
+#define MATCH_VLSEG5E16V       0x80005007
+#define MASK_VLSEG5E16V                0xfdf0707f
+#define MATCH_VSSEG5E16V       0x80005027
+#define MASK_VSSEG5E16V                0xfdf0707f
+#define MATCH_VLSEG6E16V       0xa0005007
+#define MASK_VLSEG6E16V                0xfdf0707f
+#define MATCH_VSSEG6E16V       0xa0005027
+#define MASK_VSSEG6E16V                0xfdf0707f
+#define MATCH_VLSEG7E16V       0xc0005007
+#define MASK_VLSEG7E16V                0xfdf0707f
+#define MATCH_VSSEG7E16V       0xc0005027
+#define MASK_VSSEG7E16V                0xfdf0707f
+#define MATCH_VLSEG8E16V       0xe0005007
+#define MASK_VLSEG8E16V                0xfdf0707f
+#define MATCH_VSSEG8E16V       0xe0005027
+#define MASK_VSSEG8E16V                0xfdf0707f
+#define MATCH_VLSEG2E32V       0x20006007
+#define MASK_VLSEG2E32V                0xfdf0707f
+#define MATCH_VSSEG2E32V       0x20006027
+#define MASK_VSSEG2E32V                0xfdf0707f
+#define MATCH_VLSEG3E32V       0x40006007
+#define MASK_VLSEG3E32V                0xfdf0707f
+#define MATCH_VSSEG3E32V       0x40006027
+#define MASK_VSSEG3E32V                0xfdf0707f
+#define MATCH_VLSEG4E32V       0x60006007
+#define MASK_VLSEG4E32V                0xfdf0707f
+#define MATCH_VSSEG4E32V       0x60006027
+#define MASK_VSSEG4E32V                0xfdf0707f
+#define MATCH_VLSEG5E32V       0x80006007
+#define MASK_VLSEG5E32V                0xfdf0707f
+#define MATCH_VSSEG5E32V       0x80006027
+#define MASK_VSSEG5E32V                0xfdf0707f
+#define MATCH_VLSEG6E32V       0xa0006007
+#define MASK_VLSEG6E32V                0xfdf0707f
+#define MATCH_VSSEG6E32V       0xa0006027
+#define MASK_VSSEG6E32V                0xfdf0707f
+#define MATCH_VLSEG7E32V       0xc0006007
+#define MASK_VLSEG7E32V                0xfdf0707f
+#define MATCH_VSSEG7E32V       0xc0006027
+#define MASK_VSSEG7E32V                0xfdf0707f
+#define MATCH_VLSEG8E32V       0xe0006007
+#define MASK_VLSEG8E32V                0xfdf0707f
+#define MATCH_VSSEG8E32V       0xe0006027
+#define MASK_VSSEG8E32V                0xfdf0707f
+#define MATCH_VLSEG2E64V       0x20007007
+#define MASK_VLSEG2E64V                0xfdf0707f
+#define MATCH_VSSEG2E64V       0x20007027
+#define MASK_VSSEG2E64V                0xfdf0707f
+#define MATCH_VLSEG3E64V       0x40007007
+#define MASK_VLSEG3E64V                0xfdf0707f
+#define MATCH_VSSEG3E64V       0x40007027
+#define MASK_VSSEG3E64V                0xfdf0707f
+#define MATCH_VLSEG4E64V       0x60007007
+#define MASK_VLSEG4E64V                0xfdf0707f
+#define MATCH_VSSEG4E64V       0x60007027
+#define MASK_VSSEG4E64V                0xfdf0707f
+#define MATCH_VLSEG5E64V       0x80007007
+#define MASK_VLSEG5E64V                0xfdf0707f
+#define MATCH_VSSEG5E64V       0x80007027
+#define MASK_VSSEG5E64V                0xfdf0707f
+#define MATCH_VLSEG6E64V       0xa0007007
+#define MASK_VLSEG6E64V                0xfdf0707f
+#define MATCH_VSSEG6E64V       0xa0007027
+#define MASK_VSSEG6E64V                0xfdf0707f
+#define MATCH_VLSEG7E64V       0xc0007007
+#define MASK_VLSEG7E64V                0xfdf0707f
+#define MATCH_VSSEG7E64V       0xc0007027
+#define MASK_VSSEG7E64V                0xfdf0707f
+#define MATCH_VLSEG8E64V       0xe0007007
+#define MASK_VLSEG8E64V                0xfdf0707f
+#define MATCH_VSSEG8E64V       0xe0007027
+#define MASK_VSSEG8E64V                0xfdf0707f
+#define MATCH_VLSSEG2E8V       0x28000007
+#define MASK_VLSSEG2E8V                0xfc00707f
+#define MATCH_VSSSEG2E8V       0x28000027
+#define MASK_VSSSEG2E8V                0xfc00707f
+#define MATCH_VLSSEG3E8V       0x48000007
+#define MASK_VLSSEG3E8V                0xfc00707f
+#define MATCH_VSSSEG3E8V       0x48000027
+#define MASK_VSSSEG3E8V                0xfc00707f
+#define MATCH_VLSSEG4E8V       0x68000007
+#define MASK_VLSSEG4E8V                0xfc00707f
+#define MATCH_VSSSEG4E8V       0x68000027
+#define MASK_VSSSEG4E8V                0xfc00707f
+#define MATCH_VLSSEG5E8V       0x88000007
+#define MASK_VLSSEG5E8V                0xfc00707f
+#define MATCH_VSSSEG5E8V       0x88000027
+#define MASK_VSSSEG5E8V                0xfc00707f
+#define MATCH_VLSSEG6E8V       0xa8000007
+#define MASK_VLSSEG6E8V                0xfc00707f
+#define MATCH_VSSSEG6E8V       0xa8000027
+#define MASK_VSSSEG6E8V                0xfc00707f
+#define MATCH_VLSSEG7E8V       0xc8000007
+#define MASK_VLSSEG7E8V                0xfc00707f
+#define MATCH_VSSSEG7E8V       0xc8000027
+#define MASK_VSSSEG7E8V                0xfc00707f
+#define MATCH_VLSSEG8E8V       0xe8000007
+#define MASK_VLSSEG8E8V                0xfc00707f
+#define MATCH_VSSSEG8E8V       0xe8000027
+#define MASK_VSSSEG8E8V                0xfc00707f
+#define MATCH_VLSSEG2E16V      0x28005007
+#define MASK_VLSSEG2E16V       0xfc00707f
+#define MATCH_VSSSEG2E16V      0x28005027
+#define MASK_VSSSEG2E16V       0xfc00707f
+#define MATCH_VLSSEG3E16V      0x48005007
+#define MASK_VLSSEG3E16V       0xfc00707f
+#define MATCH_VSSSEG3E16V      0x48005027
+#define MASK_VSSSEG3E16V       0xfc00707f
+#define MATCH_VLSSEG4E16V      0x68005007
+#define MASK_VLSSEG4E16V       0xfc00707f
+#define MATCH_VSSSEG4E16V      0x68005027
+#define MASK_VSSSEG4E16V       0xfc00707f
+#define MATCH_VLSSEG5E16V      0x88005007
+#define MASK_VLSSEG5E16V       0xfc00707f
+#define MATCH_VSSSEG5E16V      0x88005027
+#define MASK_VSSSEG5E16V       0xfc00707f
+#define MATCH_VLSSEG6E16V      0xa8005007
+#define MASK_VLSSEG6E16V       0xfc00707f
+#define MATCH_VSSSEG6E16V      0xa8005027
+#define MASK_VSSSEG6E16V       0xfc00707f
+#define MATCH_VLSSEG7E16V      0xc8005007
+#define MASK_VLSSEG7E16V       0xfc00707f
+#define MATCH_VSSSEG7E16V      0xc8005027
+#define MASK_VSSSEG7E16V       0xfc00707f
+#define MATCH_VLSSEG8E16V      0xe8005007
+#define MASK_VLSSEG8E16V       0xfc00707f
+#define MATCH_VSSSEG8E16V      0xe8005027
+#define MASK_VSSSEG8E16V       0xfc00707f
+#define MATCH_VLSSEG2E32V      0x28006007
+#define MASK_VLSSEG2E32V       0xfc00707f
+#define MATCH_VSSSEG2E32V      0x28006027
+#define MASK_VSSSEG2E32V       0xfc00707f
+#define MATCH_VLSSEG3E32V      0x48006007
+#define MASK_VLSSEG3E32V       0xfc00707f
+#define MATCH_VSSSEG3E32V      0x48006027
+#define MASK_VSSSEG3E32V       0xfc00707f
+#define MATCH_VLSSEG4E32V      0x68006007
+#define MASK_VLSSEG4E32V       0xfc00707f
+#define MATCH_VSSSEG4E32V      0x68006027
+#define MASK_VSSSEG4E32V       0xfc00707f
+#define MATCH_VLSSEG5E32V      0x88006007
+#define MASK_VLSSEG5E32V       0xfc00707f
+#define MATCH_VSSSEG5E32V      0x88006027
+#define MASK_VSSSEG5E32V       0xfc00707f
+#define MATCH_VLSSEG6E32V      0xa8006007
+#define MASK_VLSSEG6E32V       0xfc00707f
+#define MATCH_VSSSEG6E32V      0xa8006027
+#define MASK_VSSSEG6E32V       0xfc00707f
+#define MATCH_VLSSEG7E32V      0xc8006007
+#define MASK_VLSSEG7E32V       0xfc00707f
+#define MATCH_VSSSEG7E32V      0xc8006027
+#define MASK_VSSSEG7E32V       0xfc00707f
+#define MATCH_VLSSEG8E32V      0xe8006007
+#define MASK_VLSSEG8E32V       0xfc00707f
+#define MATCH_VSSSEG8E32V      0xe8006027
+#define MASK_VSSSEG8E32V       0xfc00707f
+#define MATCH_VLSSEG2E64V      0x28007007
+#define MASK_VLSSEG2E64V       0xfc00707f
+#define MATCH_VSSSEG2E64V      0x28007027
+#define MASK_VSSSEG2E64V       0xfc00707f
+#define MATCH_VLSSEG3E64V      0x48007007
+#define MASK_VLSSEG3E64V       0xfc00707f
+#define MATCH_VSSSEG3E64V      0x48007027
+#define MASK_VSSSEG3E64V       0xfc00707f
+#define MATCH_VLSSEG4E64V      0x68007007
+#define MASK_VLSSEG4E64V       0xfc00707f
+#define MATCH_VSSSEG4E64V      0x68007027
+#define MASK_VSSSEG4E64V       0xfc00707f
+#define MATCH_VLSSEG5E64V      0x88007007
+#define MASK_VLSSEG5E64V       0xfc00707f
+#define MATCH_VSSSEG5E64V      0x88007027
+#define MASK_VSSSEG5E64V       0xfc00707f
+#define MATCH_VLSSEG6E64V      0xa8007007
+#define MASK_VLSSEG6E64V       0xfc00707f
+#define MATCH_VSSSEG6E64V      0xa8007027
+#define MASK_VSSSEG6E64V       0xfc00707f
+#define MATCH_VLSSEG7E64V      0xc8007007
+#define MASK_VLSSEG7E64V       0xfc00707f
+#define MATCH_VSSSEG7E64V      0xc8007027
+#define MASK_VSSSEG7E64V       0xfc00707f
+#define MATCH_VLSSEG8E64V      0xe8007007
+#define MASK_VLSSEG8E64V       0xfc00707f
+#define MATCH_VSSSEG8E64V      0xe8007027
+#define MASK_VSSSEG8E64V       0xfc00707f
+#define MATCH_VLOXSEG2EI8V     0x2c000007
+#define MASK_VLOXSEG2EI8V      0xfc00707f
+#define MATCH_VSOXSEG2EI8V     0x2c000027
+#define MASK_VSOXSEG2EI8V      0xfc00707f
+#define MATCH_VLOXSEG3EI8V     0x4c000007
+#define MASK_VLOXSEG3EI8V      0xfc00707f
+#define MATCH_VSOXSEG3EI8V     0x4c000027
+#define MASK_VSOXSEG3EI8V      0xfc00707f
+#define MATCH_VLOXSEG4EI8V     0x6c000007
+#define MASK_VLOXSEG4EI8V      0xfc00707f
+#define MATCH_VSOXSEG4EI8V     0x6c000027
+#define MASK_VSOXSEG4EI8V      0xfc00707f
+#define MATCH_VLOXSEG5EI8V     0x8c000007
+#define MASK_VLOXSEG5EI8V      0xfc00707f
+#define MATCH_VSOXSEG5EI8V     0x8c000027
+#define MASK_VSOXSEG5EI8V      0xfc00707f
+#define MATCH_VLOXSEG6EI8V     0xac000007
+#define MASK_VLOXSEG6EI8V      0xfc00707f
+#define MATCH_VSOXSEG6EI8V     0xac000027
+#define MASK_VSOXSEG6EI8V      0xfc00707f
+#define MATCH_VLOXSEG7EI8V     0xcc000007
+#define MASK_VLOXSEG7EI8V      0xfc00707f
+#define MATCH_VSOXSEG7EI8V     0xcc000027
+#define MASK_VSOXSEG7EI8V      0xfc00707f
+#define MATCH_VLOXSEG8EI8V     0xec000007
+#define MASK_VLOXSEG8EI8V      0xfc00707f
+#define MATCH_VSOXSEG8EI8V     0xec000027
+#define MASK_VSOXSEG8EI8V      0xfc00707f
+#define MATCH_VLUXSEG2EI8V     0x24000007
+#define MASK_VLUXSEG2EI8V      0xfc00707f
+#define MATCH_VSUXSEG2EI8V     0x24000027
+#define MASK_VSUXSEG2EI8V      0xfc00707f
+#define MATCH_VLUXSEG3EI8V     0x44000007
+#define MASK_VLUXSEG3EI8V      0xfc00707f
+#define MATCH_VSUXSEG3EI8V     0x44000027
+#define MASK_VSUXSEG3EI8V      0xfc00707f
+#define MATCH_VLUXSEG4EI8V     0x64000007
+#define MASK_VLUXSEG4EI8V      0xfc00707f
+#define MATCH_VSUXSEG4EI8V     0x64000027
+#define MASK_VSUXSEG4EI8V      0xfc00707f
+#define MATCH_VLUXSEG5EI8V     0x84000007
+#define MASK_VLUXSEG5EI8V      0xfc00707f
+#define MATCH_VSUXSEG5EI8V     0x84000027
+#define MASK_VSUXSEG5EI8V      0xfc00707f
+#define MATCH_VLUXSEG6EI8V     0xa4000007
+#define MASK_VLUXSEG6EI8V      0xfc00707f
+#define MATCH_VSUXSEG6EI8V     0xa4000027
+#define MASK_VSUXSEG6EI8V      0xfc00707f
+#define MATCH_VLUXSEG7EI8V     0xc4000007
+#define MASK_VLUXSEG7EI8V      0xfc00707f
+#define MATCH_VSUXSEG7EI8V     0xc4000027
+#define MASK_VSUXSEG7EI8V      0xfc00707f
+#define MATCH_VLUXSEG8EI8V     0xe4000007
+#define MASK_VLUXSEG8EI8V      0xfc00707f
+#define MATCH_VSUXSEG8EI8V     0xe4000027
+#define MASK_VSUXSEG8EI8V      0xfc00707f
+#define MATCH_VLOXSEG2EI16V    0x2c005007
+#define MASK_VLOXSEG2EI16V     0xfc00707f
+#define MATCH_VSOXSEG2EI16V    0x2c005027
+#define MASK_VSOXSEG2EI16V     0xfc00707f
+#define MATCH_VLOXSEG3EI16V    0x4c005007
+#define MASK_VLOXSEG3EI16V     0xfc00707f
+#define MATCH_VSOXSEG3EI16V    0x4c005027
+#define MASK_VSOXSEG3EI16V     0xfc00707f
+#define MATCH_VLOXSEG4EI16V    0x6c005007
+#define MASK_VLOXSEG4EI16V     0xfc00707f
+#define MATCH_VSOXSEG4EI16V    0x6c005027
+#define MASK_VSOXSEG4EI16V     0xfc00707f
+#define MATCH_VLOXSEG5EI16V    0x8c005007
+#define MASK_VLOXSEG5EI16V     0xfc00707f
+#define MATCH_VSOXSEG5EI16V    0x8c005027
+#define MASK_VSOXSEG5EI16V     0xfc00707f
+#define MATCH_VLOXSEG6EI16V    0xac005007
+#define MASK_VLOXSEG6EI16V     0xfc00707f
+#define MATCH_VSOXSEG6EI16V    0xac005027
+#define MASK_VSOXSEG6EI16V     0xfc00707f
+#define MATCH_VLOXSEG7EI16V    0xcc005007
+#define MASK_VLOXSEG7EI16V     0xfc00707f
+#define MATCH_VSOXSEG7EI16V    0xcc005027
+#define MASK_VSOXSEG7EI16V     0xfc00707f
+#define MATCH_VLOXSEG8EI16V    0xec005007
+#define MASK_VLOXSEG8EI16V     0xfc00707f
+#define MATCH_VSOXSEG8EI16V    0xec005027
+#define MASK_VSOXSEG8EI16V     0xfc00707f
+#define MATCH_VLUXSEG2EI16V    0x24005007
+#define MASK_VLUXSEG2EI16V     0xfc00707f
+#define MATCH_VSUXSEG2EI16V    0x24005027
+#define MASK_VSUXSEG2EI16V     0xfc00707f
+#define MATCH_VLUXSEG3EI16V    0x44005007
+#define MASK_VLUXSEG3EI16V     0xfc00707f
+#define MATCH_VSUXSEG3EI16V    0x44005027
+#define MASK_VSUXSEG3EI16V     0xfc00707f
+#define MATCH_VLUXSEG4EI16V    0x64005007
+#define MASK_VLUXSEG4EI16V     0xfc00707f
+#define MATCH_VSUXSEG4EI16V    0x64005027
+#define MASK_VSUXSEG4EI16V     0xfc00707f
+#define MATCH_VLUXSEG5EI16V    0x84005007
+#define MASK_VLUXSEG5EI16V     0xfc00707f
+#define MATCH_VSUXSEG5EI16V    0x84005027
+#define MASK_VSUXSEG5EI16V     0xfc00707f
+#define MATCH_VLUXSEG6EI16V    0xa4005007
+#define MASK_VLUXSEG6EI16V     0xfc00707f
+#define MATCH_VSUXSEG6EI16V    0xa4005027
+#define MASK_VSUXSEG6EI16V     0xfc00707f
+#define MATCH_VLUXSEG7EI16V    0xc4005007
+#define MASK_VLUXSEG7EI16V     0xfc00707f
+#define MATCH_VSUXSEG7EI16V    0xc4005027
+#define MASK_VSUXSEG7EI16V     0xfc00707f
+#define MATCH_VLUXSEG8EI16V    0xe4005007
+#define MASK_VLUXSEG8EI16V     0xfc00707f
+#define MATCH_VSUXSEG8EI16V    0xe4005027
+#define MASK_VSUXSEG8EI16V     0xfc00707f
+#define MATCH_VLOXSEG2EI32V    0x2c006007
+#define MASK_VLOXSEG2EI32V     0xfc00707f
+#define MATCH_VSOXSEG2EI32V    0x2c006027
+#define MASK_VSOXSEG2EI32V     0xfc00707f
+#define MATCH_VLOXSEG3EI32V    0x4c006007
+#define MASK_VLOXSEG3EI32V     0xfc00707f
+#define MATCH_VSOXSEG3EI32V    0x4c006027
+#define MASK_VSOXSEG3EI32V     0xfc00707f
+#define MATCH_VLOXSEG4EI32V    0x6c006007
+#define MASK_VLOXSEG4EI32V     0xfc00707f
+#define MATCH_VSOXSEG4EI32V    0x6c006027
+#define MASK_VSOXSEG4EI32V     0xfc00707f
+#define MATCH_VLOXSEG5EI32V    0x8c006007
+#define MASK_VLOXSEG5EI32V     0xfc00707f
+#define MATCH_VSOXSEG5EI32V    0x8c006027
+#define MASK_VSOXSEG5EI32V     0xfc00707f
+#define MATCH_VLOXSEG6EI32V    0xac006007
+#define MASK_VLOXSEG6EI32V     0xfc00707f
+#define MATCH_VSOXSEG6EI32V    0xac006027
+#define MASK_VSOXSEG6EI32V     0xfc00707f
+#define MATCH_VLOXSEG7EI32V    0xcc006007
+#define MASK_VLOXSEG7EI32V     0xfc00707f
+#define MATCH_VSOXSEG7EI32V    0xcc006027
+#define MASK_VSOXSEG7EI32V     0xfc00707f
+#define MATCH_VLOXSEG8EI32V    0xec006007
+#define MASK_VLOXSEG8EI32V     0xfc00707f
+#define MATCH_VSOXSEG8EI32V    0xec006027
+#define MASK_VSOXSEG8EI32V     0xfc00707f
+#define MATCH_VLUXSEG2EI32V    0x24006007
+#define MASK_VLUXSEG2EI32V     0xfc00707f
+#define MATCH_VSUXSEG2EI32V    0x24006027
+#define MASK_VSUXSEG2EI32V     0xfc00707f
+#define MATCH_VLUXSEG3EI32V    0x44006007
+#define MASK_VLUXSEG3EI32V     0xfc00707f
+#define MATCH_VSUXSEG3EI32V    0x44006027
+#define MASK_VSUXSEG3EI32V     0xfc00707f
+#define MATCH_VLUXSEG4EI32V    0x64006007
+#define MASK_VLUXSEG4EI32V     0xfc00707f
+#define MATCH_VSUXSEG4EI32V    0x64006027
+#define MASK_VSUXSEG4EI32V     0xfc00707f
+#define MATCH_VLUXSEG5EI32V    0x84006007
+#define MASK_VLUXSEG5EI32V     0xfc00707f
+#define MATCH_VSUXSEG5EI32V    0x84006027
+#define MASK_VSUXSEG5EI32V     0xfc00707f
+#define MATCH_VLUXSEG6EI32V    0xa4006007
+#define MASK_VLUXSEG6EI32V     0xfc00707f
+#define MATCH_VSUXSEG6EI32V    0xa4006027
+#define MASK_VSUXSEG6EI32V     0xfc00707f
+#define MATCH_VLUXSEG7EI32V    0xc4006007
+#define MASK_VLUXSEG7EI32V     0xfc00707f
+#define MATCH_VSUXSEG7EI32V    0xc4006027
+#define MASK_VSUXSEG7EI32V     0xfc00707f
+#define MATCH_VLUXSEG8EI32V    0xe4006007
+#define MASK_VLUXSEG8EI32V     0xfc00707f
+#define MATCH_VSUXSEG8EI32V    0xe4006027
+#define MASK_VSUXSEG8EI32V     0xfc00707f
+#define MATCH_VLOXSEG2EI64V    0x2c007007
+#define MASK_VLOXSEG2EI64V     0xfc00707f
+#define MATCH_VSOXSEG2EI64V    0x2c007027
+#define MASK_VSOXSEG2EI64V     0xfc00707f
+#define MATCH_VLOXSEG3EI64V    0x4c007007
+#define MASK_VLOXSEG3EI64V     0xfc00707f
+#define MATCH_VSOXSEG3EI64V    0x4c007027
+#define MASK_VSOXSEG3EI64V     0xfc00707f
+#define MATCH_VLOXSEG4EI64V    0x6c007007
+#define MASK_VLOXSEG4EI64V     0xfc00707f
+#define MATCH_VSOXSEG4EI64V    0x6c007027
+#define MASK_VSOXSEG4EI64V     0xfc00707f
+#define MATCH_VLOXSEG5EI64V    0x8c007007
+#define MASK_VLOXSEG5EI64V     0xfc00707f
+#define MATCH_VSOXSEG5EI64V    0x8c007027
+#define MASK_VSOXSEG5EI64V     0xfc00707f
+#define MATCH_VLOXSEG6EI64V    0xac007007
+#define MASK_VLOXSEG6EI64V     0xfc00707f
+#define MATCH_VSOXSEG6EI64V    0xac007027
+#define MASK_VSOXSEG6EI64V     0xfc00707f
+#define MATCH_VLOXSEG7EI64V    0xcc007007
+#define MASK_VLOXSEG7EI64V     0xfc00707f
+#define MATCH_VSOXSEG7EI64V    0xcc007027
+#define MASK_VSOXSEG7EI64V     0xfc00707f
+#define MATCH_VLOXSEG8EI64V    0xec007007
+#define MASK_VLOXSEG8EI64V     0xfc00707f
+#define MATCH_VSOXSEG8EI64V    0xec007027
+#define MASK_VSOXSEG8EI64V     0xfc00707f
+#define MATCH_VLUXSEG2EI64V    0x24007007
+#define MASK_VLUXSEG2EI64V     0xfc00707f
+#define MATCH_VSUXSEG2EI64V    0x24007027
+#define MASK_VSUXSEG2EI64V     0xfc00707f
+#define MATCH_VLUXSEG3EI64V    0x44007007
+#define MASK_VLUXSEG3EI64V     0xfc00707f
+#define MATCH_VSUXSEG3EI64V    0x44007027
+#define MASK_VSUXSEG3EI64V     0xfc00707f
+#define MATCH_VLUXSEG4EI64V    0x64007007
+#define MASK_VLUXSEG4EI64V     0xfc00707f
+#define MATCH_VSUXSEG4EI64V    0x64007027
+#define MASK_VSUXSEG4EI64V     0xfc00707f
+#define MATCH_VLUXSEG5EI64V    0x84007007
+#define MASK_VLUXSEG5EI64V     0xfc00707f
+#define MATCH_VSUXSEG5EI64V    0x84007027
+#define MASK_VSUXSEG5EI64V     0xfc00707f
+#define MATCH_VLUXSEG6EI64V    0xa4007007
+#define MASK_VLUXSEG6EI64V     0xfc00707f
+#define MATCH_VSUXSEG6EI64V    0xa4007027
+#define MASK_VSUXSEG6EI64V     0xfc00707f
+#define MATCH_VLUXSEG7EI64V    0xc4007007
+#define MASK_VLUXSEG7EI64V     0xfc00707f
+#define MATCH_VSUXSEG7EI64V    0xc4007027
+#define MASK_VSUXSEG7EI64V     0xfc00707f
+#define MATCH_VLUXSEG8EI64V    0xe4007007
+#define MASK_VLUXSEG8EI64V     0xfc00707f
+#define MATCH_VSUXSEG8EI64V    0xe4007027
+#define MASK_VSUXSEG8EI64V     0xfc00707f
+#define MATCH_VLSEG2E8FFV      0x21000007
+#define MASK_VLSEG2E8FFV       0xfdf0707f
+#define MATCH_VLSEG3E8FFV      0x41000007
+#define MASK_VLSEG3E8FFV       0xfdf0707f
+#define MATCH_VLSEG4E8FFV      0x61000007
+#define MASK_VLSEG4E8FFV       0xfdf0707f
+#define MATCH_VLSEG5E8FFV      0x81000007
+#define MASK_VLSEG5E8FFV       0xfdf0707f
+#define MATCH_VLSEG6E8FFV      0xa1000007
+#define MASK_VLSEG6E8FFV       0xfdf0707f
+#define MATCH_VLSEG7E8FFV      0xc1000007
+#define MASK_VLSEG7E8FFV       0xfdf0707f
+#define MATCH_VLSEG8E8FFV      0xe1000007
+#define MASK_VLSEG8E8FFV       0xfdf0707f
+#define MATCH_VLSEG2E16FFV     0x21005007
+#define MASK_VLSEG2E16FFV      0xfdf0707f
+#define MATCH_VLSEG3E16FFV     0x41005007
+#define MASK_VLSEG3E16FFV      0xfdf0707f
+#define MATCH_VLSEG4E16FFV     0x61005007
+#define MASK_VLSEG4E16FFV      0xfdf0707f
+#define MATCH_VLSEG5E16FFV     0x81005007
+#define MASK_VLSEG5E16FFV      0xfdf0707f
+#define MATCH_VLSEG6E16FFV     0xa1005007
+#define MASK_VLSEG6E16FFV      0xfdf0707f
+#define MATCH_VLSEG7E16FFV     0xc1005007
+#define MASK_VLSEG7E16FFV      0xfdf0707f
+#define MATCH_VLSEG8E16FFV     0xe1005007
+#define MASK_VLSEG8E16FFV      0xfdf0707f
+#define MATCH_VLSEG2E32FFV     0x21006007
+#define MASK_VLSEG2E32FFV      0xfdf0707f
+#define MATCH_VLSEG3E32FFV     0x41006007
+#define MASK_VLSEG3E32FFV      0xfdf0707f
+#define MATCH_VLSEG4E32FFV     0x61006007
+#define MASK_VLSEG4E32FFV      0xfdf0707f
+#define MATCH_VLSEG5E32FFV     0x81006007
+#define MASK_VLSEG5E32FFV      0xfdf0707f
+#define MATCH_VLSEG6E32FFV     0xa1006007
+#define MASK_VLSEG6E32FFV      0xfdf0707f
+#define MATCH_VLSEG7E32FFV     0xc1006007
+#define MASK_VLSEG7E32FFV      0xfdf0707f
+#define MATCH_VLSEG8E32FFV     0xe1006007
+#define MASK_VLSEG8E32FFV      0xfdf0707f
+#define MATCH_VLSEG2E64FFV     0x21007007
+#define MASK_VLSEG2E64FFV      0xfdf0707f
+#define MATCH_VLSEG3E64FFV     0x41007007
+#define MASK_VLSEG3E64FFV      0xfdf0707f
+#define MATCH_VLSEG4E64FFV     0x61007007
+#define MASK_VLSEG4E64FFV      0xfdf0707f
+#define MATCH_VLSEG5E64FFV     0x81007007
+#define MASK_VLSEG5E64FFV      0xfdf0707f
+#define MATCH_VLSEG6E64FFV     0xa1007007
+#define MASK_VLSEG6E64FFV      0xfdf0707f
+#define MATCH_VLSEG7E64FFV     0xc1007007
+#define MASK_VLSEG7E64FFV      0xfdf0707f
+#define MATCH_VLSEG8E64FFV     0xe1007007
+#define MASK_VLSEG8E64FFV      0xfdf0707f
+#define MATCH_VL1RE8V          0x02800007
+#define MASK_VL1RE8V           0xfff0707f
+#define MATCH_VL1RE16V         0x02805007
+#define MASK_VL1RE16V          0xfff0707f
+#define MATCH_VL1RE32V         0x02806007
+#define MASK_VL1RE32V          0xfff0707f
+#define MATCH_VL1RE64V         0x02807007
+#define MASK_VL1RE64V          0xfff0707f
+#define MATCH_VL2RE8V          0x22800007
+#define MASK_VL2RE8V           0xfff0707f
+#define MATCH_VL2RE16V         0x22805007
+#define MASK_VL2RE16V          0xfff0707f
+#define MATCH_VL2RE32V         0x22806007
+#define MASK_VL2RE32V          0xfff0707f
+#define MATCH_VL2RE64V         0x22807007
+#define MASK_VL2RE64V          0xfff0707f
+#define MATCH_VL4RE8V          0x62800007
+#define MASK_VL4RE8V           0xfff0707f
+#define MATCH_VL4RE16V         0x62805007
+#define MASK_VL4RE16V          0xfff0707f
+#define MATCH_VL4RE32V         0x62806007
+#define MASK_VL4RE32V          0xfff0707f
+#define MATCH_VL4RE64V         0x62807007
+#define MASK_VL4RE64V          0xfff0707f
+#define MATCH_VL8RE8V          0xe2800007
+#define MASK_VL8RE8V           0xfff0707f
+#define MATCH_VL8RE16V         0xe2805007
+#define MASK_VL8RE16V          0xfff0707f
+#define MATCH_VL8RE32V         0xe2806007
+#define MASK_VL8RE32V          0xfff0707f
+#define MATCH_VL8RE64V         0xe2807007
+#define MASK_VL8RE64V          0xfff0707f
+#define MATCH_VS1RV            0x02800027
+#define MASK_VS1RV             0xfff0707f
+#define MATCH_VS2RV            0x22800027
+#define MASK_VS2RV             0xfff0707f
+#define MATCH_VS4RV            0x62800027
+#define MASK_VS4RV             0xfff0707f
+#define MATCH_VS8RV            0xe2800027
+#define MASK_VS8RV             0xfff0707f
+#define MATCH_VAMOADDEI8V      0x0000002f
+#define MASK_VAMOADDEI8V       0xf800707f
+#define MATCH_VAMOSWAPEI8V     0x0800002f
+#define MASK_VAMOSWAPEI8V      0xf800707f
+#define MATCH_VAMOXOREI8V      0x2000002f
+#define MASK_VAMOXOREI8V       0xf800707f
+#define MATCH_VAMOANDEI8V      0x6000002f
+#define MASK_VAMOANDEI8V       0xf800707f
+#define MATCH_VAMOOREI8V       0x4000002f
+#define MASK_VAMOOREI8V        0xf800707f
+#define MATCH_VAMOMINEI8V      0x8000002f
+#define MASK_VAMOMINEI8V       0xf800707f
+#define MATCH_VAMOMAXEI8V      0xa000002f
+#define MASK_VAMOMAXEI8V       0xf800707f
+#define MATCH_VAMOMINUEI8V     0xc000002f
+#define MASK_VAMOMINUEI8V      0xf800707f
+#define MATCH_VAMOMAXUEI8V     0xe000002f
+#define MASK_VAMOMAXUEI8V      0xf800707f
+#define MATCH_VAMOADDEI16V     0x0000502f
+#define MASK_VAMOADDEI16V      0xf800707f
+#define MATCH_VAMOSWAPEI16V    0x0800502f
+#define MASK_VAMOSWAPEI16V     0xf800707f
+#define MATCH_VAMOXOREI16V     0x2000502f
+#define MASK_VAMOXOREI16V      0xf800707f
+#define MATCH_VAMOANDEI16V     0x6000502f
+#define MASK_VAMOANDEI16V      0xf800707f
+#define MATCH_VAMOOREI16V      0x4000502f
+#define MASK_VAMOOREI16V       0xf800707f
+#define MATCH_VAMOMINEI16V     0x8000502f
+#define MASK_VAMOMINEI16V      0xf800707f
+#define MATCH_VAMOMAXEI16V     0xa000502f
+#define MASK_VAMOMAXEI16V      0xf800707f
+#define MATCH_VAMOMINUEI16V    0xc000502f
+#define MASK_VAMOMINUEI16V     0xf800707f
+#define MATCH_VAMOMAXUEI16V    0xe000502f
+#define MASK_VAMOMAXUEI16V     0xf800707f
+#define MATCH_VAMOADDEI32V     0x0000602f
+#define MASK_VAMOADDEI32V      0xf800707f
+#define MATCH_VAMOSWAPEI32V    0x0800602f
+#define MASK_VAMOSWAPEI32V     0xf800707f
+#define MATCH_VAMOXOREI32V     0x2000602f
+#define MASK_VAMOXOREI32V      0xf800707f
+#define MATCH_VAMOANDEI32V     0x6000602f
+#define MASK_VAMOANDEI32V      0xf800707f
+#define MATCH_VAMOOREI32V      0x4000602f
+#define MASK_VAMOOREI32V       0xf800707f
+#define MATCH_VAMOMINEI32V     0x8000602f
+#define MASK_VAMOMINEI32V      0xf800707f
+#define MATCH_VAMOMAXEI32V     0xa000602f
+#define MASK_VAMOMAXEI32V      0xf800707f
+#define MATCH_VAMOMINUEI32V    0xc000602f
+#define MASK_VAMOMINUEI32V     0xf800707f
+#define MATCH_VAMOMAXUEI32V    0xe000602f
+#define MASK_VAMOMAXUEI32V     0xf800707f
+#define MATCH_VAMOADDEI64V     0x0000702f
+#define MASK_VAMOADDEI64V      0xf800707f
+#define MATCH_VAMOSWAPEI64V    0x0800702f
+#define MASK_VAMOSWAPEI64V     0xf800707f
+#define MATCH_VAMOXOREI64V     0x2000702f
+#define MASK_VAMOXOREI64V      0xf800707f
+#define MATCH_VAMOANDEI64V     0x6000702f
+#define MASK_VAMOANDEI64V      0xf800707f
+#define MATCH_VAMOOREI64V      0x4000702f
+#define MASK_VAMOOREI64V       0xf800707f
+#define MATCH_VAMOMINEI64V     0x8000702f
+#define MASK_VAMOMINEI64V      0xf800707f
+#define MATCH_VAMOMAXEI64V     0xa000702f
+#define MASK_VAMOMAXEI64V      0xf800707f
+#define MATCH_VAMOMINUEI64V    0xc000702f
+#define MASK_VAMOMINUEI64V     0xf800707f
+#define MATCH_VAMOMAXUEI64V    0xe000702f
+#define MASK_VAMOMAXUEI64V     0xf800707f
+#define MATCH_VADDVV           0x00000057
+#define MASK_VADDVV            0xfc00707f
+#define MATCH_VADDVX           0x00004057
+#define MASK_VADDVX            0xfc00707f
+#define MATCH_VADDVI           0x00003057
+#define MASK_VADDVI            0xfc00707f
+#define MATCH_VSUBVV           0x08000057
+#define MASK_VSUBVV            0xfc00707f
+#define MATCH_VSUBVX           0x08004057
+#define MASK_VSUBVX            0xfc00707f
+#define MATCH_VRSUBVX          0x0c004057
+#define MASK_VRSUBVX           0xfc00707f
+#define MATCH_VRSUBVI          0x0c003057
+#define MASK_VRSUBVI           0xfc00707f
+#define MATCH_VWCVTXXV         0xc4006057
+#define MASK_VWCVTXXV          0xfc0ff07f
+#define MATCH_VWCVTUXXV        0xc0006057
+#define MASK_VWCVTUXXV         0xfc0ff07f
+#define MATCH_VWADDVV          0xc4002057
+#define MASK_VWADDVV           0xfc00707f
+#define MATCH_VWADDVX          0xc4006057
+#define MASK_VWADDVX           0xfc00707f
+#define MATCH_VWSUBVV          0xcc002057
+#define MASK_VWSUBVV           0xfc00707f
+#define MATCH_VWSUBVX          0xcc006057
+#define MASK_VWSUBVX           0xfc00707f
+#define MATCH_VWADDWV          0xd4002057
+#define MASK_VWADDWV           0xfc00707f
+#define MATCH_VWADDWX          0xd4006057
+#define MASK_VWADDWX           0xfc00707f
+#define MATCH_VWSUBWV          0xdc002057
+#define MASK_VWSUBWV           0xfc00707f
+#define MATCH_VWSUBWX          0xdc006057
+#define MASK_VWSUBWX           0xfc00707f
+#define MATCH_VWADDUVV         0xc0002057
+#define MASK_VWADDUVV          0xfc00707f
+#define MATCH_VWADDUVX         0xc0006057
+#define MASK_VWADDUVX          0xfc00707f
+#define MATCH_VWSUBUVV         0xc8002057
+#define MASK_VWSUBUVV          0xfc00707f
+#define MATCH_VWSUBUVX         0xc8006057
+#define MASK_VWSUBUVX          0xfc00707f
+#define MATCH_VWADDUWV         0xd0002057
+#define MASK_VWADDUWV          0xfc00707f
+#define MATCH_VWADDUWX         0xd0006057
+#define MASK_VWADDUWX          0xfc00707f
+#define MATCH_VWSUBUWV         0xd8002057
+#define MASK_VWSUBUWV          0xfc00707f
+#define MATCH_VWSUBUWX         0xd8006057
+#define MASK_VWSUBUWX          0xfc00707f
+#define MATCH_VZEXT_VF8        0x48012057
+#define MASK_VZEXT_VF8         0xfc0ff07f
+#define MATCH_VSEXT_VF8        0x4801a057
+#define MASK_VSEXT_VF8         0xfc0ff07f
+#define MATCH_VZEXT_VF4        0x48022057
+#define MASK_VZEXT_VF4         0xfc0ff07f
+#define MATCH_VSEXT_VF4        0x4802a057
+#define MASK_VSEXT_VF4         0xfc0ff07f
+#define MATCH_VZEXT_VF2        0x48032057
+#define MASK_VZEXT_VF2         0xfc0ff07f
+#define MATCH_VSEXT_VF2        0x4803a057
+#define MASK_VSEXT_VF2         0xfc0ff07f
+#define MATCH_VADCVVM          0x40000057
+#define MASK_VADCVVM           0xfe00707f
+#define MATCH_VADCVXM          0x40004057
+#define MASK_VADCVXM           0xfe00707f
+#define MATCH_VADCVIM          0x40003057
+#define MASK_VADCVIM           0xfe00707f
+#define MATCH_VMADCVVM         0x44000057
+#define MASK_VMADCVVM          0xfe00707f
+#define MATCH_VMADCVXM         0x44004057
+#define MASK_VMADCVXM          0xfe00707f
+#define MATCH_VMADCVIM         0x44003057
+#define MASK_VMADCVIM          0xfe00707f
+#define MATCH_VMADCVV          0x46000057
+#define MASK_VMADCVV           0xfe00707f
+#define MATCH_VMADCVX          0x46004057
+#define MASK_VMADCVX           0xfe00707f
+#define MATCH_VMADCVI          0x46003057
+#define MASK_VMADCVI           0xfe00707f
+#define MATCH_VSBCVVM          0x48000057
+#define MASK_VSBCVVM           0xfe00707f
+#define MATCH_VSBCVXM          0x48004057
+#define MASK_VSBCVXM           0xfe00707f
+#define MATCH_VMSBCVVM         0x4c000057
+#define MASK_VMSBCVVM          0xfe00707f
+#define MATCH_VMSBCVXM         0x4c004057
+#define MASK_VMSBCVXM          0xfe00707f
+#define MATCH_VMSBCVV          0x4e000057
+#define MASK_VMSBCVV           0xfe00707f
+#define MATCH_VMSBCVX          0x4e004057
+#define MASK_VMSBCVX           0xfe00707f
+#define MATCH_VNOTV            0x2c0fb057
+#define MASK_VNOTV             0xfc0ff07f
+#define MATCH_VANDVV           0x24000057
+#define MASK_VANDVV            0xfc00707f
+#define MATCH_VANDVX           0x24004057
+#define MASK_VANDVX            0xfc00707f
+#define MATCH_VANDVI           0x24003057
+#define MASK_VANDVI            0xfc00707f
+#define MATCH_VORVV            0x28000057
+#define MASK_VORVV             0xfc00707f
+#define MATCH_VORVX            0x28004057
+#define MASK_VORVX             0xfc00707f
+#define MATCH_VORVI            0x28003057
+#define MASK_VORVI             0xfc00707f
+#define MATCH_VXORVV           0x2c000057
+#define MASK_VXORVV            0xfc00707f
+#define MATCH_VXORVX           0x2c004057
+#define MASK_VXORVX            0xfc00707f
+#define MATCH_VXORVI           0x2c003057
+#define MASK_VXORVI            0xfc00707f
+#define MATCH_VSLLVV           0x94000057
+#define MASK_VSLLVV            0xfc00707f
+#define MATCH_VSLLVX           0x94004057
+#define MASK_VSLLVX            0xfc00707f
+#define MATCH_VSLLVI           0x94003057
+#define MASK_VSLLVI            0xfc00707f
+#define MATCH_VSRLVV           0xa0000057
+#define MASK_VSRLVV            0xfc00707f
+#define MATCH_VSRLVX           0xa0004057
+#define MASK_VSRLVX            0xfc00707f
+#define MATCH_VSRLVI           0xa0003057
+#define MASK_VSRLVI            0xfc00707f
+#define MATCH_VSRAVV           0xa4000057
+#define MASK_VSRAVV            0xfc00707f
+#define MATCH_VSRAVX           0xa4004057
+#define MASK_VSRAVX            0xfc00707f
+#define MATCH_VSRAVI           0xa4003057
+#define MASK_VSRAVI            0xfc00707f
+#define MATCH_VNCVTXXW         0xb0004057
+#define MASK_VNCVTXXW          0xfc0ff07f
+#define MATCH_VNSRLWV          0xb0000057
+#define MASK_VNSRLWV           0xfc00707f
+#define MATCH_VNSRLWX          0xb0004057
+#define MASK_VNSRLWX           0xfc00707f
+#define MATCH_VNSRLWI          0xb0003057
+#define MASK_VNSRLWI           0xfc00707f
+#define MATCH_VNSRAWV          0xb4000057
+#define MASK_VNSRAWV           0xfc00707f
+#define MATCH_VNSRAWX          0xb4004057
+#define MASK_VNSRAWX           0xfc00707f
+#define MATCH_VNSRAWI          0xb4003057
+#define MASK_VNSRAWI           0xfc00707f
+#define MATCH_VMSEQVV          0x60000057
+#define MASK_VMSEQVV           0xfc00707f
+#define MATCH_VMSEQVX          0x60004057
+#define MASK_VMSEQVX           0xfc00707f
+#define MATCH_VMSEQVI          0x60003057
+#define MASK_VMSEQVI           0xfc00707f
+#define MATCH_VMSNEVV          0x64000057
+#define MASK_VMSNEVV           0xfc00707f
+#define MATCH_VMSNEVX          0x64004057
+#define MASK_VMSNEVX           0xfc00707f
+#define MATCH_VMSNEVI          0x64003057
+#define MASK_VMSNEVI           0xfc00707f
+#define MATCH_VMSLTVV          0x6c000057
+#define MASK_VMSLTVV           0xfc00707f
+#define MATCH_VMSLTVX          0x6c004057
+#define MASK_VMSLTVX           0xfc00707f
+#define MATCH_VMSLTUVV         0x68000057
+#define MASK_VMSLTUVV          0xfc00707f
+#define MATCH_VMSLTUVX         0x68004057
+#define MASK_VMSLTUVX          0xfc00707f
+#define MATCH_VMSLEVV          0x74000057
+#define MASK_VMSLEVV           0xfc00707f
+#define MATCH_VMSLEVX          0x74004057
+#define MASK_VMSLEVX           0xfc00707f
+#define MATCH_VMSLEVI          0x74003057
+#define MASK_VMSLEVI           0xfc00707f
+#define MATCH_VMSLEUVV         0x70000057
+#define MASK_VMSLEUVV          0xfc00707f
+#define MATCH_VMSLEUVX         0x70004057
+#define MASK_VMSLEUVX          0xfc00707f
+#define MATCH_VMSLEUVI         0x70003057
+#define MASK_VMSLEUVI          0xfc00707f
+#define MATCH_VMSGTVX          0x7c004057
+#define MASK_VMSGTVX           0xfc00707f
+#define MATCH_VMSGTVI          0x7c003057
+#define MASK_VMSGTVI           0xfc00707f
+#define MATCH_VMSGTUVX         0x78004057
+#define MASK_VMSGTUVX          0xfc00707f
+#define MATCH_VMSGTUVI         0x78003057
+#define MASK_VMSGTUVI          0xfc00707f
+#define MATCH_VMINVV           0x14000057
+#define MASK_VMINVV            0xfc00707f
+#define MATCH_VMINVX           0x14004057
+#define MASK_VMINVX            0xfc00707f
+#define MATCH_VMAXVV           0x1c000057
+#define MASK_VMAXVV            0xfc00707f
+#define MATCH_VMAXVX           0x1c004057
+#define MASK_VMAXVX            0xfc00707f
+#define MATCH_VMINUVV          0x10000057
+#define MASK_VMINUVV           0xfc00707f
+#define MATCH_VMINUVX          0x10004057
+#define MASK_VMINUVX           0xfc00707f
+#define MATCH_VMAXUVV          0x18000057
+#define MASK_VMAXUVV           0xfc00707f
+#define MATCH_VMAXUVX          0x18004057
+#define MASK_VMAXUVX           0xfc00707f
+#define MATCH_VMULVV           0x94002057
+#define MASK_VMULVV            0xfc00707f
+#define MATCH_VMULVX           0x94006057
+#define MASK_VMULVX            0xfc00707f
+#define MATCH_VMULHVV          0x9c002057
+#define MASK_VMULHVV           0xfc00707f
+#define MATCH_VMULHVX          0x9c006057
+#define MASK_VMULHVX           0xfc00707f
+#define MATCH_VMULHUVV         0x90002057
+#define MASK_VMULHUVV          0xfc00707f
+#define MATCH_VMULHUVX         0x90006057
+#define MASK_VMULHUVX          0xfc00707f
+#define MATCH_VMULHSUVV        0x98002057
+#define MASK_VMULHSUVV         0xfc00707f
+#define MATCH_VMULHSUVX        0x98006057
+#define MASK_VMULHSUVX         0xfc00707f
+#define MATCH_VWMULVV          0xec002057
+#define MASK_VWMULVV           0xfc00707f
+#define MATCH_VWMULVX          0xec006057
+#define MASK_VWMULVX           0xfc00707f
+#define MATCH_VWMULUVV         0xe0002057
+#define MASK_VWMULUVV          0xfc00707f
+#define MATCH_VWMULUVX         0xe0006057
+#define MASK_VWMULUVX          0xfc00707f
+#define MATCH_VWMULSUVV        0xe8002057
+#define MASK_VWMULSUVV         0xfc00707f
+#define MATCH_VWMULSUVX        0xe8006057
+#define MASK_VWMULSUVX         0xfc00707f
+#define MATCH_VMACCVV          0xb4002057
+#define MASK_VMACCVV           0xfc00707f
+#define MATCH_VMACCVX          0xb4006057
+#define MASK_VMACCVX           0xfc00707f
+#define MATCH_VNMSACVV         0xbc002057
+#define MASK_VNMSACVV          0xfc00707f
+#define MATCH_VNMSACVX         0xbc006057
+#define MASK_VNMSACVX          0xfc00707f
+#define MATCH_VMADDVV          0xa4002057
+#define MASK_VMADDVV           0xfc00707f
+#define MATCH_VMADDVX          0xa4006057
+#define MASK_VMADDVX           0xfc00707f
+#define MATCH_VNMSUBVV         0xac002057
+#define MASK_VNMSUBVV          0xfc00707f
+#define MATCH_VNMSUBVX         0xac006057
+#define MASK_VNMSUBVX          0xfc00707f
+#define MATCH_VWMACCUVV        0xf0002057
+#define MASK_VWMACCUVV         0xfc00707f
+#define MATCH_VWMACCUVX        0xf0006057
+#define MASK_VWMACCUVX         0xfc00707f
+#define MATCH_VWMACCVV         0xf4002057
+#define MASK_VWMACCVV          0xfc00707f
+#define MATCH_VWMACCVX         0xf4006057
+#define MASK_VWMACCVX          0xfc00707f
+#define MATCH_VWMACCSUVV       0xfc002057
+#define MASK_VWMACCSUVV        0xfc00707f
+#define MATCH_VWMACCSUVX       0xfc006057
+#define MASK_VWMACCSUVX        0xfc00707f
+#define MATCH_VWMACCUSVX       0xf8006057
+#define MASK_VWMACCUSVX        0xfc00707f
+#define MATCH_VQMACCUVV        0xf0000057
+#define MASK_VQMACCUVV         0xfc00707f
+#define MATCH_VQMACCUVX        0xf0004057
+#define MASK_VQMACCUVX         0xfc00707f
+#define MATCH_VQMACCVV         0xf4000057
+#define MASK_VQMACCVV          0xfc00707f
+#define MATCH_VQMACCVX         0xf4004057
+#define MASK_VQMACCVX          0xfc00707f
+#define MATCH_VQMACCSUVV       0xfc000057
+#define MASK_VQMACCSUVV        0xfc00707f
+#define MATCH_VQMACCSUVX       0xfc004057
+#define MASK_VQMACCSUVX        0xfc00707f
+#define MATCH_VQMACCUSVX       0xf8004057
+#define MASK_VQMACCUSVX        0xfc00707f
+#define MATCH_VDIVVV           0x84002057
+#define MASK_VDIVVV            0xfc00707f
+#define MATCH_VDIVVX           0x84006057
+#define MASK_VDIVVX            0xfc00707f
+#define MATCH_VDIVUVV          0x80002057
+#define MASK_VDIVUVV           0xfc00707f
+#define MATCH_VDIVUVX          0x80006057
+#define MASK_VDIVUVX           0xfc00707f
+#define MATCH_VREMVV           0x8c002057
+#define MASK_VREMVV            0xfc00707f
+#define MATCH_VREMVX           0x8c006057
+#define MASK_VREMVX            0xfc00707f
+#define MATCH_VREMUVV          0x88002057
+#define MASK_VREMUVV           0xfc00707f
+#define MATCH_VREMUVX          0x88006057
+#define MASK_VREMUVX           0xfc00707f
+#define MATCH_VMERGEVVM        0x5c000057
+#define MASK_VMERGEVVM         0xfe00707f
+#define MATCH_VMERGEVXM        0x5c004057
+#define MASK_VMERGEVXM         0xfe00707f
+#define MATCH_VMERGEVIM        0x5c003057
+#define MASK_VMERGEVIM         0xfe00707f
+#define MATCH_VMVVV            0x5e000057
+#define MASK_VMVVV             0xfff0707f
+#define MATCH_VMVVX            0x5e004057
+#define MASK_VMVVX             0xfff0707f
+#define MATCH_VMVVI            0x5e003057
+#define MASK_VMVVI             0xfff0707f
+#define MATCH_VSADDUVV         0x80000057
+#define MASK_VSADDUVV          0xfc00707f
+#define MATCH_VSADDUVX         0x80004057
+#define MASK_VSADDUVX          0xfc00707f
+#define MATCH_VSADDUVI         0x80003057
+#define MASK_VSADDUVI          0xfc00707f
+#define MATCH_VSADDVV          0x84000057
+#define MASK_VSADDVV           0xfc00707f
+#define MATCH_VSADDVX          0x84004057
+#define MASK_VSADDVX           0xfc00707f
+#define MATCH_VSADDVI          0x84003057
+#define MASK_VSADDVI           0xfc00707f
+#define MATCH_VSSUBUVV         0x88000057
+#define MASK_VSSUBUVV          0xfc00707f
+#define MATCH_VSSUBUVX         0x88004057
+#define MASK_VSSUBUVX          0xfc00707f
+#define MATCH_VSSUBVV          0x8c000057
+#define MASK_VSSUBVV           0xfc00707f
+#define MATCH_VSSUBVX          0x8c004057
+#define MASK_VSSUBVX           0xfc00707f
+#define MATCH_VAADDUVV         0x20002057
+#define MASK_VAADDUVV          0xfc00707f
+#define MATCH_VAADDUVX         0x20006057
+#define MASK_VAADDUVX          0xfc00707f
+#define MATCH_VAADDVV          0x24002057
+#define MASK_VAADDVV           0xfc00707f
+#define MATCH_VAADDVX          0x24006057
+#define MASK_VAADDVX           0xfc00707f
+#define MATCH_VASUBUVV         0x28002057
+#define MASK_VASUBUVV          0xfc00707f
+#define MATCH_VASUBUVX         0x28006057
+#define MASK_VASUBUVX          0xfc00707f
+#define MATCH_VASUBVV          0x2c002057
+#define MASK_VASUBVV           0xfc00707f
+#define MATCH_VASUBVX          0x2c006057
+#define MASK_VASUBVX           0xfc00707f
+#define MATCH_VSMULVV          0x9c000057
+#define MASK_VSMULVV           0xfc00707f
+#define MATCH_VSMULVX          0x9c004057
+#define MASK_VSMULVX           0xfc00707f
+#define MATCH_VSSRLVV          0xa8000057
+#define MASK_VSSRLVV           0xfc00707f
+#define MATCH_VSSRLVX          0xa8004057
+#define MASK_VSSRLVX           0xfc00707f
+#define MATCH_VSSRLVI          0xa8003057
+#define MASK_VSSRLVI           0xfc00707f
+#define MATCH_VSSRAVV          0xac000057
+#define MASK_VSSRAVV           0xfc00707f
+#define MATCH_VSSRAVX          0xac004057
+#define MASK_VSSRAVX           0xfc00707f
+#define MATCH_VSSRAVI          0xac003057
+#define MASK_VSSRAVI           0xfc00707f
+#define MATCH_VNCLIPUWV        0xb8000057
+#define MASK_VNCLIPUWV         0xfc00707f
+#define MATCH_VNCLIPUWX        0xb8004057
+#define MASK_VNCLIPUWX         0xfc00707f
+#define MATCH_VNCLIPUWI        0xb8003057
+#define MASK_VNCLIPUWI         0xfc00707f
+#define MATCH_VNCLIPWV         0xbc000057
+#define MASK_VNCLIPWV          0xfc00707f
+#define MATCH_VNCLIPWX         0xbc004057
+#define MASK_VNCLIPWX          0xfc00707f
+#define MATCH_VNCLIPWI         0xbc003057
+#define MASK_VNCLIPWI          0xfc00707f
+#define MATCH_VFADDVV          0x00001057
+#define MASK_VFADDVV           0xfc00707f
+#define MATCH_VFADDVF          0x00005057
+#define MASK_VFADDVF           0xfc00707f
+#define MATCH_VFSUBVV          0x08001057
+#define MASK_VFSUBVV           0xfc00707f
+#define MATCH_VFSUBVF          0x08005057
+#define MASK_VFSUBVF           0xfc00707f
+#define MATCH_VFRSUBVF         0x9c005057
+#define MASK_VFRSUBVF          0xfc00707f
+#define MATCH_VFWADDVV         0xc0001057
+#define MASK_VFWADDVV          0xfc00707f
+#define MATCH_VFWADDVF         0xc0005057
+#define MASK_VFWADDVF          0xfc00707f
+#define MATCH_VFWSUBVV         0xc8001057
+#define MASK_VFWSUBVV          0xfc00707f
+#define MATCH_VFWSUBVF         0xc8005057
+#define MASK_VFWSUBVF          0xfc00707f
+#define MATCH_VFWADDWV         0xd0001057
+#define MASK_VFWADDWV          0xfc00707f
+#define MATCH_VFWADDWF         0xd0005057
+#define MASK_VFWADDWF          0xfc00707f
+#define MATCH_VFWSUBWV         0xd8001057
+#define MASK_VFWSUBWV          0xfc00707f
+#define MATCH_VFWSUBWF         0xd8005057
+#define MASK_VFWSUBWF          0xfc00707f
+#define MATCH_VFMULVV          0x90001057
+#define MASK_VFMULVV           0xfc00707f
+#define MATCH_VFMULVF          0x90005057
+#define MASK_VFMULVF           0xfc00707f
+#define MATCH_VFDIVVV          0x80001057
+#define MASK_VFDIVVV           0xfc00707f
+#define MATCH_VFDIVVF          0x80005057
+#define MASK_VFDIVVF           0xfc00707f
+#define MATCH_VFRDIVVF         0x84005057
+#define MASK_VFRDIVVF          0xfc00707f
+#define MATCH_VFWMULVV         0xe0001057
+#define MASK_VFWMULVV          0xfc00707f
+#define MATCH_VFWMULVF         0xe0005057
+#define MASK_VFWMULVF          0xfc00707f
+#define MATCH_VFMADDVV         0xa0001057
+#define MASK_VFMADDVV          0xfc00707f
+#define MATCH_VFMADDVF         0xa0005057
+#define MASK_VFMADDVF          0xfc00707f
+#define MATCH_VFNMADDVV        0xa4001057
+#define MASK_VFNMADDVV         0xfc00707f
+#define MATCH_VFNMADDVF        0xa4005057
+#define MASK_VFNMADDVF         0xfc00707f
+#define MATCH_VFMSUBVV         0xa8001057
+#define MASK_VFMSUBVV          0xfc00707f
+#define MATCH_VFMSUBVF         0xa8005057
+#define MASK_VFMSUBVF          0xfc00707f
+#define MATCH_VFNMSUBVV        0xac001057
+#define MASK_VFNMSUBVV         0xfc00707f
+#define MATCH_VFNMSUBVF        0xac005057
+#define MASK_VFNMSUBVF         0xfc00707f
+#define MATCH_VFMACCVV         0xb0001057
+#define MASK_VFMACCVV          0xfc00707f
+#define MATCH_VFMACCVF         0xb0005057
+#define MASK_VFMACCVF          0xfc00707f
+#define MATCH_VFNMACCVV        0xb4001057
+#define MASK_VFNMACCVV         0xfc00707f
+#define MATCH_VFNMACCVF        0xb4005057
+#define MASK_VFNMACCVF         0xfc00707f
+#define MATCH_VFMSACVV         0xb8001057
+#define MASK_VFMSACVV          0xfc00707f
+#define MATCH_VFMSACVF         0xb8005057
+#define MASK_VFMSACVF          0xfc00707f
+#define MATCH_VFNMSACVV        0xbc001057
+#define MASK_VFNMSACVV         0xfc00707f
+#define MATCH_VFNMSACVF        0xbc005057
+#define MASK_VFNMSACVF         0xfc00707f
+#define MATCH_VFWMACCVV        0xf0001057
+#define MASK_VFWMACCVV         0xfc00707f
+#define MATCH_VFWMACCVF        0xf0005057
+#define MASK_VFWMACCVF         0xfc00707f
+#define MATCH_VFWNMACCVV       0xf4001057
+#define MASK_VFWNMACCVV        0xfc00707f
+#define MATCH_VFWNMACCVF       0xf4005057
+#define MASK_VFWNMACCVF        0xfc00707f
+#define MATCH_VFWMSACVV        0xf8001057
+#define MASK_VFWMSACVV         0xfc00707f
+#define MATCH_VFWMSACVF        0xf8005057
+#define MASK_VFWMSACVF         0xfc00707f
+#define MATCH_VFWNMSACVV       0xfc001057
+#define MASK_VFWNMSACVV        0xfc00707f
+#define MATCH_VFWNMSACVF       0xfc005057
+#define MASK_VFWNMSACVF        0xfc00707f
+#define MATCH_VFSQRTV          0x4c001057
+#define MASK_VFSQRTV           0xfc0ff07f
+#define MATCH_VFRSQRT7V        0x4c021057
+#define MASK_VFRSQRT7V         0xfc0ff07f
+#define MATCH_VFREC7V          0x4c029057
+#define MASK_VFREC7V           0xfc0ff07f
+#define MATCH_VFCLASSV         0x4c081057
+#define MASK_VFCLASSV          0xfc0ff07f
+#define MATCH_VFMINVV          0x10001057
+#define MASK_VFMINVV           0xfc00707f
+#define MATCH_VFMINVF          0x10005057
+#define MASK_VFMINVF           0xfc00707f
+#define MATCH_VFMAXVV          0x18001057
+#define MASK_VFMAXVV           0xfc00707f
+#define MATCH_VFMAXVF          0x18005057
+#define MASK_VFMAXVF           0xfc00707f
+#define MATCH_VFSGNJVV         0x20001057
+#define MASK_VFSGNJVV          0xfc00707f
+#define MATCH_VFSGNJVF         0x20005057
+#define MASK_VFSGNJVF          0xfc00707f
+#define MATCH_VFSGNJNVV                0x24001057
+#define MASK_VFSGNJNVV         0xfc00707f
+#define MATCH_VFSGNJNVF                0x24005057
+#define MASK_VFSGNJNVF         0xfc00707f
+#define MATCH_VFSGNJXVV                0x28001057
+#define MASK_VFSGNJXVV         0xfc00707f
+#define MATCH_VFSGNJXVF                0x28005057
+#define MASK_VFSGNJXVF         0xfc00707f
+#define MATCH_VMFEQVV          0x60001057
+#define MASK_VMFEQVV           0xfc00707f
+#define MATCH_VMFEQVF          0x60005057
+#define MASK_VMFEQVF           0xfc00707f
+#define MATCH_VMFNEVV          0x70001057
+#define MASK_VMFNEVV           0xfc00707f
+#define MATCH_VMFNEVF          0x70005057
+#define MASK_VMFNEVF           0xfc00707f
+#define MATCH_VMFLTVV          0x6c001057
+#define MASK_VMFLTVV           0xfc00707f
+#define MATCH_VMFLTVF          0x6c005057
+#define MASK_VMFLTVF           0xfc00707f
+#define MATCH_VMFLEVV          0x64001057
+#define MASK_VMFLEVV           0xfc00707f
+#define MATCH_VMFLEVF          0x64005057
+#define MASK_VMFLEVF           0xfc00707f
+#define MATCH_VMFGTVF          0x74005057
+#define MASK_VMFGTVF           0xfc00707f
+#define MATCH_VMFGEVF          0x7c005057
+#define MASK_VMFGEVF           0xfc00707f
+#define MATCH_VFMERGEVFM       0x5c005057
+#define MASK_VFMERGEVFM                0xfe00707f
+#define MATCH_VFMVVF           0x5e005057
+#define MASK_VFMVVF            0xfff0707f
+#define MATCH_VFCVTXUFV                0x48001057
+#define MASK_VFCVTXUFV         0xfc0ff07f
+#define MATCH_VFCVTXFV         0x48009057
+#define MASK_VFCVTXFV          0xfc0ff07f
+#define MATCH_VFCVTFXUV                0x48011057
+#define MASK_VFCVTFXUV         0xfc0ff07f
+#define MATCH_VFCVTFXV         0x48019057
+#define MASK_VFCVTFXV          0xfc0ff07f
+#define MATCH_VFCVTRTZXUFV     0x48031057
+#define MASK_VFCVTRTZXUFV      0xfc0ff07f
+#define MATCH_VFCVTRTZXFV      0x48039057
+#define MASK_VFCVTRTZXFV       0xfc0ff07f
+#define MATCH_VFWCVTXUFV       0x48041057
+#define MASK_VFWCVTXUFV                0xfc0ff07f
+#define MATCH_VFWCVTXFV                0x48049057
+#define MASK_VFWCVTXFV         0xfc0ff07f
+#define MATCH_VFWCVTFXUV       0x48051057
+#define MASK_VFWCVTFXUV                0xfc0ff07f
+#define MATCH_VFWCVTFXV                0x48059057
+#define MASK_VFWCVTFXV         0xfc0ff07f
+#define MATCH_VFWCVTFFV                0x48061057
+#define MASK_VFWCVTFFV         0xfc0ff07f
+#define MATCH_VFWCVTRTZXUFV    0x48071057
+#define MASK_VFWCVTRTZXUFV     0xfc0ff07f
+#define MATCH_VFWCVTRTZXFV     0x48079057
+#define MASK_VFWCVTRTZXFV      0xfc0ff07f
+#define MATCH_VFNCVTXUFW       0x48081057
+#define MASK_VFNCVTXUFW                0xfc0ff07f
+#define MATCH_VFNCVTXFW                0x48089057
+#define MASK_VFNCVTXFW         0xfc0ff07f
+#define MATCH_VFNCVTFXUW       0x48091057
+#define MASK_VFNCVTFXUW                0xfc0ff07f
+#define MATCH_VFNCVTFXW                0x48099057
+#define MASK_VFNCVTFXW         0xfc0ff07f
+#define MATCH_VFNCVTFFW                0x480a1057
+#define MASK_VFNCVTFFW         0xfc0ff07f
+#define MATCH_VFNCVTRODFFW     0x480a9057
+#define MASK_VFNCVTRODFFW      0xfc0ff07f
+#define MATCH_VFNCVTRTZXUFW    0x480b1057
+#define MASK_VFNCVTRTZXUFW     0xfc0ff07f
+#define MATCH_VFNCVTRTZXFW     0x480b9057
+#define MASK_VFNCVTRTZXFW      0xfc0ff07f
+#define MATCH_VREDSUMVS                0x00002057
+#define MASK_VREDSUMVS         0xfc00707f
+#define MATCH_VREDMAXVS                0x1c002057
+#define MASK_VREDMAXVS         0xfc00707f
+#define MATCH_VREDMAXUVS       0x18002057
+#define MASK_VREDMAXUVS                0xfc00707f
+#define MATCH_VREDMINVS                0x14002057
+#define MASK_VREDMINVS         0xfc00707f
+#define MATCH_VREDMINUVS       0x10002057
+#define MASK_VREDMINUVS                0xfc00707f
+#define MATCH_VREDANDVS                0x04002057
+#define MASK_VREDANDVS         0xfc00707f
+#define MATCH_VREDORVS         0x08002057
+#define MASK_VREDORVS          0xfc00707f
+#define MATCH_VREDXORVS                0x0c002057
+#define MASK_VREDXORVS         0xfc00707f
+#define MATCH_VWREDSUMUVS      0xc0000057
+#define MASK_VWREDSUMUVS       0xfc00707f
+#define MATCH_VWREDSUMVS       0xc4000057
+#define MASK_VWREDSUMVS                0xfc00707f
+#define MATCH_VFREDOSUMVS      0x0c001057
+#define MASK_VFREDOSUMVS       0xfc00707f
+#define MATCH_VFREDSUMVS       0x04001057
+#define MASK_VFREDSUMVS                0xfc00707f
+#define MATCH_VFREDMAXVS       0x1c001057
+#define MASK_VFREDMAXVS                0xfc00707f
+#define MATCH_VFREDMINVS       0x14001057
+#define MASK_VFREDMINVS                0xfc00707f
+#define MATCH_VFWREDOSUMVS     0xcc001057
+#define MASK_VFWREDOSUMVS      0xfc00707f
+#define MATCH_VFWREDSUMVS      0xc4001057
+#define MASK_VFWREDSUMVS       0xfc00707f
+#define MATCH_VMANDMM          0x66002057
+#define MASK_VMANDMM           0xfe00707f
+#define MATCH_VMNANDMM         0x76002057
+#define MASK_VMNANDMM          0xfe00707f
+#define MATCH_VMANDNOTMM       0x62002057
+#define MASK_VMANDNOTMM                0xfe00707f
+#define MATCH_VMXORMM          0x6e002057
+#define MASK_VMXORMM           0xfe00707f
+#define MATCH_VMORMM           0x6a002057
+#define MASK_VMORMM            0xfe00707f
+#define MATCH_VMNORMM          0x7a002057
+#define MASK_VMNORMM           0xfe00707f
+#define MATCH_VMORNOTMM                0x72002057
+#define MASK_VMORNOTMM         0xfe00707f
+#define MATCH_VMXNORMM         0x7e002057
+#define MASK_VMXNORMM          0xfe00707f
+#define MATCH_VPOPCM           0x40082057
+#define MASK_VPOPCM            0xfc0ff07f
+#define MATCH_VFIRSTM          0x4008a057
+#define MASK_VFIRSTM           0xfc0ff07f
+#define MATCH_VMSBFM           0x5000a057
+#define MASK_VMSBFM            0xfc0ff07f
+#define MATCH_VMSIFM           0x5001a057
+#define MASK_VMSIFM            0xfc0ff07f
+#define MATCH_VMSOFM           0x50012057
+#define MASK_VMSOFM            0xfc0ff07f
+#define MATCH_VIOTAM           0x50082057
+#define MASK_VIOTAM            0xfc0ff07f
+#define MATCH_VIDV             0x5008a057
+#define MASK_VIDV              0xfdfff07f
+#define MATCH_VMVXS            0x42002057
+#define MASK_VMVXS             0xfe0ff07f
+#define MATCH_VMVSX            0x42006057
+#define MASK_VMVSX             0xfff0707f
+#define MATCH_VFMVFS           0x42001057
+#define MASK_VFMVFS            0xfe0ff07f
+#define MATCH_VFMVSF           0x42005057
+#define MASK_VFMVSF            0xfff0707f
+#define MATCH_VSLIDEUPVX       0x38004057
+#define MASK_VSLIDEUPVX                0xfc00707f
+#define MATCH_VSLIDEUPVI       0x38003057
+#define MASK_VSLIDEUPVI                0xfc00707f
+#define MATCH_VSLIDEDOWNVX     0x3c004057
+#define MASK_VSLIDEDOWNVX      0xfc00707f
+#define MATCH_VSLIDEDOWNVI     0x3c003057
+#define MASK_VSLIDEDOWNVI      0xfc00707f
+#define MATCH_VSLIDE1UPVX      0x38006057
+#define MASK_VSLIDE1UPVX       0xfc00707f
+#define MATCH_VSLIDE1DOWNVX    0x3c006057
+#define MASK_VSLIDE1DOWNVX     0xfc00707f
+#define MATCH_VFSLIDE1UPVF     0x38005057
+#define MASK_VFSLIDE1UPVF      0xfc00707f
+#define MATCH_VFSLIDE1DOWNVF   0x3c005057
+#define MASK_VFSLIDE1DOWNVF    0xfc00707f
+#define MATCH_VRGATHERVV       0x30000057
+#define MASK_VRGATHERVV                0xfc00707f
+#define MATCH_VRGATHERVX       0x30004057
+#define MASK_VRGATHERVX                0xfc00707f
+#define MATCH_VRGATHERVI       0x30003057
+#define MASK_VRGATHERVI                0xfc00707f
+#define MATCH_VRGATHEREI16VV   0x38000057
+#define MASK_VRGATHEREI16VV    0xfc00707f
+#define MATCH_VCOMPRESSVM      0x5e002057
+#define MASK_VCOMPRESSVM       0xfe00707f
+#define MATCH_VMV1RV           0x9e003057
+#define MASK_VMV1RV            0xfe0ff07f
+#define MATCH_VMV2RV           0x9e00b057
+#define MASK_VMV2RV            0xfe0ff07f
+#define MATCH_VMV4RV           0x9e01b057
+#define MASK_VMV4RV            0xfe0ff07f
+#define MATCH_VMV8RV           0x9e03b057
+#define MASK_VMV8RV            0xfe0ff07f
+#define MATCH_VDOTVV           0xe4000057
+#define MASK_VDOTVV            0xfc00707f
+#define MATCH_VDOTUVV          0xe0000057
+#define MASK_VDOTUVV           0xfc00707f
+#define MATCH_VFDOTVV          0xe4001057
+#define MASK_VFDOTVV           0xfc00707f
 #endif /* RISCV_EXTENDED_ENCODING_H */
+#ifdef DECLARE_CSR
+/* Unprivileged extended CSR addresses.  */
+#define CSR_VSTART 0x008
+#define CSR_VXSAT 0x009
+#define CSR_VXRM 0x00a
+#define CSR_VCSR 0x00f
+#define CSR_VL 0xc20
+#define CSR_VTYPE 0xc21
+#define CSR_VLENB 0xc22
+/* Unprivileged extended CSRs.  */
+DECLARE_CSR(vstart, CSR_VSTART, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+DECLARE_CSR(vxsat, CSR_VXSAT, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+DECLARE_CSR(vxrm, CSR_VXRM, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+DECLARE_CSR(vcsr, CSR_VCSR, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+DECLARE_CSR(vl, CSR_VL, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+DECLARE_CSR(vtype, CSR_VTYPE, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+DECLARE_CSR(vlenb, CSR_VLENB, CSR_CLASS_V, PRIV_SPEC_CLASS_NONE, PRIV_SPEC_CLASS_NONE)
+#endif /* DECLARE_CSR */
index a336eb42489c36916fe508f9a7ac7a6f4173620c..5cafcfb983f1f64a6902fac2fbb80aa958102379 100644 (file)
@@ -353,7 +353,8 @@ struct riscv_opcode
 
   /* A function to determine if a word corresponds to this instruction.
      Usually, this computes ((word & mask) == match).  */
-  int (*match_func) (const struct riscv_opcode *op, insn_t word);
+  int (*match_func) (const struct riscv_opcode *op, insn_t word,
+                    int constraints, const char **error);
 
   /* For a macro, this is INSN_MACRO.  Otherwise, it is a collection
      of bits describing the instruction, notably any relevant hazard
@@ -445,6 +446,79 @@ extern const struct riscv_opcode riscv_insn_types[];
 
 /* Extended extensions.  */
 
+/* RVV IMM encodings.  */
+#define EXTRACT_RVV_VI_IMM(x) \
+  (RV_X(x, 15, 5) | (-RV_X(x, 19, 1) << 5))
+#define EXTRACT_RVV_VI_UIMM(x) \
+  (RV_X(x, 15, 5))
+#define EXTRACT_RVV_OFFSET(x) \
+  (RV_X(x, 29, 3))
+#define EXTRACT_RVV_VB_IMM(x) \
+  (RV_X(x, 20, 10))
+#define EXTRACT_RVV_VC_IMM(x) \
+  (RV_X(x, 20, 11))
+#define ENCODE_RVV_VB_IMM(x) \
+  (RV_X(x, 0, 10) << 20)
+#define ENCODE_RVV_VC_IMM(x) \
+  (RV_X(x, 0, 11) << 20)
+#define VALID_RVV_VB_IMM(x) (EXTRACT_RVV_VB_IMM(ENCODE_RVV_VB_IMM(x)) == (x))
+#define VALID_RVV_VC_IMM(x) (EXTRACT_RVV_VC_IMM(ENCODE_RVV_VC_IMM(x)) == (x))
+/* RVV fields.  */
+#define OP_MASK_VD             0x1f
+#define OP_SH_VD               7
+#define OP_MASK_VS1            0x1f
+#define OP_SH_VS1              15
+#define OP_MASK_VS2            0x1f
+#define OP_SH_VS2              20
+#define OP_MASK_VIMM           0x1f
+#define OP_SH_VIMM             15
+#define OP_MASK_VMASK          0x1
+#define OP_SH_VMASK            25
+#define OP_MASK_VFUNCT6                0x3f
+#define OP_SH_VFUNCT6          26
+#define OP_MASK_VLMUL          0x7
+#define OP_SH_VLMUL            0
+#define OP_MASK_VSEW           0x7
+#define OP_SH_VSEW             3
+#define OP_MASK_VTA            0x1
+#define OP_SH_VTA              6
+#define OP_MASK_VMA            0x1
+#define OP_SH_VMA              7
+#define OP_MASK_VTYPE_RES      0x1
+#define OP_SH_VTYPE_RES                10
+#define OP_MASK_VWD            0x1
+#define OP_SH_VWD              26
+/* RVV definitions.  */
+#define NVECR 32
+#define NVECM 1
+
+/* All RISC-V extended instructions belong to at least one of
+   these classes.  */
+enum riscv_extended_insn_class
+{
+  /* Draft */
+  INSN_CLASS_V = INSN_CLASS_EXTENDED,
+  INSN_CLASS_V_AND_F,
+  INSN_CLASS_V_OR_ZVAMO,
+  INSN_CLASS_V_OR_ZVLSSEG,
+};
+
+/* This is a list of macro-expanded instructions for the extended
+   extensions.  */
+enum
+{
+  M_VMSGE = M_EXTENDED,
+  M_VMSGEU,
+};
+
+/* RVV */
+extern const char * const riscv_vecr_names_numeric[NVECR];
+extern const char * const riscv_vecm_names_numeric[NVECM];
+extern const char * const riscv_vsew[8];
+extern const char * const riscv_vlmul[8];
+extern const char * const riscv_vta[2];
+extern const char * const riscv_vma[2];
+
 extern const struct riscv_opcode *riscv_extended_opcodes[];
 
 #endif /* _RISCV_H_ */
index a9493a906d59b3240900cb8060dd22bf16ae0d2f..f0cf871cdbaa62e5018da5de3731f8a213a24429 100644 (file)
@@ -178,13 +178,93 @@ maybe_print_address (struct riscv_private_data *pd, int base_reg, int offset,
 
 static bool
 print_extended_insn_args (const char **opcode_args,
-                         insn_t l ATTRIBUTE_UNUSED,
-                         disassemble_info *info ATTRIBUTE_UNUSED)
+                         insn_t l,
+                         disassemble_info *info)
 {
+  fprintf_ftype print = info->fprintf_func;
   const char *oparg = *opcode_args;
 
   switch (*oparg)
     {
+    case 'V': /* RVV */
+      switch (*++oparg)
+       {
+       case 'd':
+       case 'f':
+         print (info->stream, "%s",
+                riscv_vecr_names_numeric[EXTRACT_OPERAND (VD, l)]);
+         break;
+
+       case 'e':
+         if (!EXTRACT_OPERAND (VWD, l))
+           print (info->stream, "%s", riscv_gpr_names[0]);
+         else
+           print (info->stream, "%s",
+                  riscv_vecr_names_numeric[EXTRACT_OPERAND (VD, l)]);
+         break;
+
+       case 's':
+         print (info->stream, "%s",
+                riscv_vecr_names_numeric[EXTRACT_OPERAND (VS1, l)]);
+         break;
+
+       case 't':
+       case 'u': /* VS1 == VS2 already verified at this point.  */
+       case 'v': /* VD == VS1 == VS2 already verified at this point.  */
+         print (info->stream, "%s",
+                riscv_vecr_names_numeric[EXTRACT_OPERAND (VS2, l)]);
+         break;
+
+       case '0':
+         print (info->stream, "%s", riscv_vecr_names_numeric[0]);
+         break;
+
+       case 'b':
+       case 'c':
+         {
+           int imm = (*oparg == 'b') ? EXTRACT_RVV_VB_IMM (l)
+                                     : EXTRACT_RVV_VC_IMM (l);
+           unsigned int imm_vlmul = EXTRACT_OPERAND (VLMUL, imm);
+           unsigned int imm_vsew = EXTRACT_OPERAND (VSEW, imm);
+           unsigned int imm_vta = EXTRACT_OPERAND (VTA, imm);
+           unsigned int imm_vma = EXTRACT_OPERAND (VMA, imm);
+           unsigned int imm_vtype_res = EXTRACT_OPERAND (VTYPE_RES, imm);
+
+           if (imm_vsew < ARRAY_SIZE (riscv_vsew)
+               && imm_vlmul < ARRAY_SIZE (riscv_vlmul)
+               && imm_vta < ARRAY_SIZE (riscv_vta)
+               && imm_vma < ARRAY_SIZE (riscv_vma)
+               && ! imm_vtype_res)
+             print (info->stream, "%s,%s,%s,%s", riscv_vsew[imm_vsew],
+                    riscv_vlmul[imm_vlmul], riscv_vta[imm_vta],
+                    riscv_vma[imm_vma]);
+           else
+             print (info->stream, "%d", imm);
+         }
+         break;
+
+       case 'i':
+         print (info->stream, "%d", (int)EXTRACT_RVV_VI_IMM (l));
+         break;
+
+       case 'j':
+         print (info->stream, "%d", (int)EXTRACT_RVV_VI_UIMM (l));
+         break;
+
+       case 'k':
+         print (info->stream, "%d", (int)EXTRACT_RVV_OFFSET (l));
+         break;
+
+       case 'm':
+         if (! EXTRACT_OPERAND (VMASK, l))
+             print (info->stream, ",%s", riscv_vecm_names_numeric[0]);
+         break;
+
+       default:
+         return false;
+       }
+      break;
+
     default:
       return false;
     }
@@ -511,7 +591,7 @@ riscv_disassemble_opcode (insn_t word,
          for (; op->name; op++)
            {
              /* Does the opcode match?  */
-             if (! (op->match_func) (op, word))
+             if (! (op->match_func) (op, word, 0, NULL))
                continue;
              /* Is this a pseudo-instruction and may we print it as such?  */
              if (no_aliases && (op->pinfo & INSN_ALIAS))
index 8eab231628733f619f95471a163c422602048050..20bf090489e6b91ce49fee0ace2e4cc89bb52a45 100644 (file)
@@ -87,65 +87,91 @@ const char * const riscv_fpr_names_abi[NFPR] =
 #define MATCH_SHAMT_ORC_B (0b00111 << OP_SH_SHAMT)
 
 static int
-match_opcode (const struct riscv_opcode *op, insn_t insn)
+match_opcode (const struct riscv_opcode *op,
+             insn_t insn,
+             int constraints ATTRIBUTE_UNUSED,
+             const char **error ATTRIBUTE_UNUSED)
 {
   return ((insn ^ op->match) & op->mask) == 0;
 }
 
 static int
 match_never (const struct riscv_opcode *op ATTRIBUTE_UNUSED,
-            insn_t insn ATTRIBUTE_UNUSED)
+            insn_t insn ATTRIBUTE_UNUSED,
+            int constraints ATTRIBUTE_UNUSED,
+            const char **error ATTRIBUTE_UNUSED)
 {
   return 0;
 }
 
 static int
-match_rs1_eq_rs2 (const struct riscv_opcode *op, insn_t insn)
+match_rs1_eq_rs2 (const struct riscv_opcode *op,
+                 insn_t insn,
+                 int constraints ATTRIBUTE_UNUSED,
+                 const char **error ATTRIBUTE_UNUSED)
 {
   int rs1 = (insn & MASK_RS1) >> OP_SH_RS1;
   int rs2 = (insn & MASK_RS2) >> OP_SH_RS2;
-  return match_opcode (op, insn) && rs1 == rs2;
+  return match_opcode (op, insn, 0, NULL) && rs1 == rs2;
 }
 
 static int
-match_rd_nonzero (const struct riscv_opcode *op, insn_t insn)
+match_rd_nonzero (const struct riscv_opcode *op,
+                 insn_t insn,
+                 int constraints ATTRIBUTE_UNUSED,
+                 const char **error ATTRIBUTE_UNUSED)
 {
-  return match_opcode (op, insn) && ((insn & MASK_RD) != 0);
+  return match_opcode (op, insn, 0, NULL) && ((insn & MASK_RD) != 0);
 }
 
 static int
-match_c_add (const struct riscv_opcode *op, insn_t insn)
+match_c_add (const struct riscv_opcode *op,
+            insn_t insn,
+            int constraints ATTRIBUTE_UNUSED,
+            const char **error ATTRIBUTE_UNUSED)
 {
-  return match_rd_nonzero (op, insn) && ((insn & MASK_CRS2) != 0);
+  return match_rd_nonzero (op, insn, 0, NULL) && ((insn & MASK_CRS2) != 0);
 }
 
 /* We don't allow mv zero,X to become a c.mv hint, so we need a separate
    matching function for this.  */
 
 static int
-match_c_add_with_hint (const struct riscv_opcode *op, insn_t insn)
+match_c_add_with_hint (const struct riscv_opcode *op, insn_t insn,
+                      int constraints ATTRIBUTE_UNUSED,
+                      const char **error ATTRIBUTE_UNUSED)
 {
-  return match_opcode (op, insn) && ((insn & MASK_CRS2) != 0);
+  return match_opcode (op, insn, 0, NULL) && ((insn & MASK_CRS2) != 0);
 }
 
 static int
-match_c_nop (const struct riscv_opcode *op, insn_t insn)
+match_c_nop (const struct riscv_opcode *op,
+            insn_t insn,
+            int constraints ATTRIBUTE_UNUSED,
+            const char **error ATTRIBUTE_UNUSED)
 {
-  return (match_opcode (op, insn)
+  return (match_opcode (op, insn, 0, NULL)
          && (((insn & MASK_RD) >> OP_SH_RD) == 0));
 }
 
 static int
-match_c_addi16sp (const struct riscv_opcode *op, insn_t insn)
+match_c_addi16sp (const struct riscv_opcode *op,
+                 insn_t insn,
+                 int constraints ATTRIBUTE_UNUSED,
+                 const char **error ATTRIBUTE_UNUSED)
 {
-  return (match_opcode (op, insn)
-         && (((insn & MASK_RD) >> OP_SH_RD) == 2));
+  return (match_opcode (op, insn, 0, NULL)
+         && (((insn & MASK_RD) >> OP_SH_RD) == 2)
+         && EXTRACT_CITYPE_ADDI16SP_IMM (insn) != 0);
 }
 
 static int
-match_c_lui (const struct riscv_opcode *op, insn_t insn)
+match_c_lui (const struct riscv_opcode *op,
+            insn_t insn,
+            int constraints ATTRIBUTE_UNUSED,
+            const char **error ATTRIBUTE_UNUSED)
 {
-  return (match_rd_nonzero (op, insn)
+  return (match_rd_nonzero (op, insn, 0, NULL)
          && (((insn & MASK_RD) >> OP_SH_RD) != 2)
          && EXTRACT_CITYPE_LUI_IMM (insn) != 0);
 }
@@ -154,50 +180,70 @@ match_c_lui (const struct riscv_opcode *op, insn_t insn)
    matching function for this.  */
 
 static int
-match_c_lui_with_hint (const struct riscv_opcode *op, insn_t insn)
+match_c_lui_with_hint (const struct riscv_opcode *op,
+                      insn_t insn,
+                      int constraints ATTRIBUTE_UNUSED,
+                      const char **error ATTRIBUTE_UNUSED)
 {
-  return (match_opcode (op, insn)
+  return (match_opcode (op, insn, 0, NULL)
          && (((insn & MASK_RD) >> OP_SH_RD) != 2)
          && EXTRACT_CITYPE_LUI_IMM (insn) != 0);
 }
 
 static int
-match_c_addi4spn (const struct riscv_opcode *op, insn_t insn)
+match_c_addi4spn (const struct riscv_opcode *op,
+                 insn_t insn,
+                 int constraints ATTRIBUTE_UNUSED,
+                 const char **error ATTRIBUTE_UNUSED)
 {
-  return match_opcode (op, insn) && EXTRACT_CIWTYPE_ADDI4SPN_IMM (insn) != 0;
+  return (match_opcode (op, insn, 0, NULL)
+         && EXTRACT_CIWTYPE_ADDI4SPN_IMM (insn) != 0);
 }
 
 /* This requires a non-zero shift.  A zero rd is a hint, so is allowed.  */
 
 static int
-match_c_slli (const struct riscv_opcode *op, insn_t insn)
+match_c_slli (const struct riscv_opcode *op,
+             insn_t insn,
+             int constraints ATTRIBUTE_UNUSED,
+             const char **error ATTRIBUTE_UNUSED)
 {
-  return match_opcode (op, insn) && EXTRACT_CITYPE_IMM (insn) != 0;
+  return match_opcode (op, insn, 0, NULL) && EXTRACT_CITYPE_IMM (insn) != 0;
 }
 
 /* This requires a non-zero rd, and a non-zero shift.  */
 
 static int
-match_slli_as_c_slli (const struct riscv_opcode *op, insn_t insn)
+match_slli_as_c_slli (const struct riscv_opcode *op,
+                     insn_t insn,
+                     int constraints ATTRIBUTE_UNUSED,
+                     const char **error ATTRIBUTE_UNUSED)
 {
-  return match_rd_nonzero (op, insn) && EXTRACT_CITYPE_IMM (insn) != 0;
+  return (match_rd_nonzero (op, insn, 0, NULL)
+         && EXTRACT_CITYPE_IMM (insn) != 0);
 }
 
 /* This requires a zero shift.  A zero rd is a hint, so is allowed.  */
 
 static int
-match_c_slli64 (const struct riscv_opcode *op, insn_t insn)
+match_c_slli64 (const struct riscv_opcode *op,
+               insn_t insn,
+               int constraints ATTRIBUTE_UNUSED,
+               const char **error ATTRIBUTE_UNUSED)
 {
-  return match_opcode (op, insn) && EXTRACT_CITYPE_IMM (insn) == 0;
+  return match_opcode (op, insn, 0, NULL) && EXTRACT_CITYPE_IMM (insn) == 0;
 }
 
 /* This is used for both srli and srai.  This requires a non-zero shift.
    A zero rd is not possible.  */
 
 static int
-match_srxi_as_c_srxi (const struct riscv_opcode *op, insn_t insn)
+match_srxi_as_c_srxi (const struct riscv_opcode *op,
+                     insn_t insn,
+                     int constraints ATTRIBUTE_UNUSED,
+                     const char **error ATTRIBUTE_UNUSED)
 {
-  return match_opcode (op, insn) && EXTRACT_CITYPE_IMM (insn) != 0;
+  return match_opcode (op, insn, 0, NULL) && EXTRACT_CITYPE_IMM (insn) != 0;
 }
 
 const struct riscv_opcode riscv_opcodes[] =
@@ -964,8 +1010,1165 @@ const struct riscv_opcode riscv_insn_types[] =
 
 /* Extended extensions.  */
 
+/* RVV registers.  */
+const char * const riscv_vecr_names_numeric[NVECR] =
+{
+  "v0",   "v1",   "v2",   "v3",   "v4",   "v5",   "v6",   "v7",
+  "v8",   "v9",   "v10",  "v11",  "v12",  "v13",  "v14",  "v15",
+  "v16",  "v17",  "v18",  "v19",  "v20",  "v21",  "v22",  "v23",
+  "v24",  "v25",  "v26",  "v27",  "v28",  "v29",  "v30",  "v31"
+};
+
+/* RVV mask registers.  */
+const char * const riscv_vecm_names_numeric[NVECM] =
+{
+  "v0.t"
+};
+
+/* The vsetvli vsew constants.  */
+const char * const riscv_vsew[8] =
+{
+  "e8", "e16", "e32", "e64", "e128", "e256", "e512", "e1024"
+};
+
+/* The vsetvli vlmul constants.  */
+const char * const riscv_vlmul[8] =
+{
+  "m1", "m2", "m4", "m8", 0, "mf8", "mf4", "mf2"
+};
+
+/* The vsetvli vta constants.  */
+const char * const riscv_vta[2] =
+{
+  "tu", "ta"
+};
+
+/* The vsetvli vma constants.  */
+const char * const riscv_vma[2] =
+{
+  "mu", "ma"
+};
+
+#define MASK_VD  (OP_MASK_VD << OP_SH_VD)
+#define MASK_VS1 (OP_MASK_VS1 << OP_SH_VS1)
+#define MASK_VS2 (OP_MASK_VS2 << OP_SH_VS2)
+#define MASK_VMASK (OP_MASK_VMASK << OP_SH_VMASK)
+
+static int
+match_vs1_eq_vs2 (const struct riscv_opcode *op,
+                 insn_t insn,
+                 int constraints ATTRIBUTE_UNUSED,
+                 const char **error ATTRIBUTE_UNUSED)
+{
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+
+  return match_opcode (op, insn, 0, NULL) && vs1 == vs2;
+}
+
+static int
+match_vs1_eq_vs2_neq_vm (const struct riscv_opcode *op,
+                        insn_t insn,
+                        int constraints,
+                        const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL) && vs1 == vs2;
+
+  if (!vm && vm == vd)
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL) && vs1 == vs2;
+  return 0;
+}
+
+static int
+match_vd_eq_vs1_eq_vs2 (const struct riscv_opcode *op,
+                       insn_t insn,
+                       int constraints ATTRIBUTE_UNUSED,
+                       const char **error ATTRIBUTE_UNUSED)
+{
+  int vd =  (insn & MASK_VD) >> OP_SH_VD;
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+
+  return match_opcode (op, insn, 0, NULL) && vd == vs1 && vs1 == vs2;
+}
+
+/* These are used to check the vector constraints.  */
+
+static int
+match_widen_vd_neq_vs1_neq_vs2_neq_vm (const struct riscv_opcode *op,
+                                      insn_t insn,
+                                      int constraints,
+                                      const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vd % 2) != 0)
+    *error = "illegal operands vd must be multiple of 2";
+  else if (vs1 >= vd && vs1 <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vs1";
+  else if (vs2 >= vd && vs2 <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vs2";
+  else if (!vm && vm >= vd && vm <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_widen_vd_neq_vs1_neq_vm (const struct riscv_opcode *op,
+                              insn_t insn,
+                              int constraints,
+                              const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vd % 2) != 0)
+    *error = "illegal operands vd must be multiple of 2";
+  else if ((vs2 % 2) != 0)
+    *error = "illegal operands vs2 must be multiple of 2";
+  else if (vs1 >= vd && vs1 <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vs1";
+  else if (!vm && vm >= vd && vm <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_widen_vd_neq_vs2_neq_vm (const struct riscv_opcode *op,
+                              insn_t insn,
+                              int constraints,
+                              const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vd % 2) != 0)
+    *error = "illegal operands vd must be multiple of 2";
+  else if (vs2 >= vd && vs2 <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vs2";
+  else if (!vm && vm >= vd && vm <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_widen_vd_neq_vm (const struct riscv_opcode *op,
+                      insn_t insn,
+                      int constraints,
+                      const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vd % 2) != 0)
+    *error = "illegal operands vd must be multiple of 2";
+  else if ((vs2 % 2) != 0)
+    *error = "illegal operands vs2 must be multiple of 2";
+  else if (!vm && vm >= vd && vm <= (vd + 1))
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_narrow_vd_neq_vs2_neq_vm (const struct riscv_opcode *op,
+                               insn_t insn,
+                               int constraints,
+                               const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vs2 % 2) != 0)
+    *error = "illegal operands vs2 must be multiple of 2";
+  else if (vd >= vs2 && vd <= (vs2 + 1))
+    *error = "illegal operands vd cannot overlap vs2";
+  else if (!vm && vd >= vm && vd <= (vm + 1))
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_vd_neq_vs1_neq_vs2 (const struct riscv_opcode *op,
+                         insn_t insn,
+                         int constraints,
+                         const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if (vs1 == vd)
+    *error = "illegal operands vd cannot overlap vs1";
+  else if (vs2 == vd)
+    *error = "illegal operands vd cannot overlap vs2";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_vd_neq_vs1_neq_vs2_neq_vm (const struct riscv_opcode *op,
+                                insn_t insn,
+                                int constraints,
+                                const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs1 = (insn & MASK_VS1) >> OP_SH_VS1;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if (vs1 == vd)
+    *error = "illegal operands vd cannot overlap vs1";
+  else if (vs2 == vd)
+    *error = "illegal operands vd cannot overlap vs2";
+  else if (!vm && vm == vd)
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_vd_neq_vs2_neq_vm (const struct riscv_opcode *op,
+                        insn_t insn,
+                        int constraints,
+                        const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if (vs2 == vd)
+    *error = "illegal operands vd cannot overlap vs2";
+  else if (!vm && vm == vd)
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+/* v[m]adc and v[m]sbc use the vm encoding to encode the
+   carry-in v0 register.  The carry-in v0 register cannot
+   overlap with vd either, so we use the same match_vd_neq_vm
+   to check the overlap constraint.  */
+
+static int
+match_vd_neq_vm (const struct riscv_opcode *op,
+                insn_t insn,
+                int constraints,
+                const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vm = (insn & MASK_VMASK) >> OP_SH_VMASK;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if (!vm && vm == vd)
+    *error = "illegal operands vd cannot overlap vm";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_vls_nf_rv (const struct riscv_opcode *op,
+                insn_t insn,
+                int constraints,
+                const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int nf = ((insn & (0x7 << 29)) >> 29) + 1;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vd % nf) != 0)
+    *error = "illegal operands vd must be multiple of nf";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
+
+static int
+match_vmv_nf_rv (const struct riscv_opcode *op,
+                insn_t insn,
+                int constraints,
+                const char **error)
+{
+  int vd = (insn & MASK_VD) >> OP_SH_VD;
+  int vs2 = (insn & MASK_VS2) >> OP_SH_VS2;
+  int nf = ((insn & (0x7 << 15)) >> 15) + 1;
+
+  if (!constraints || error == NULL)
+    return match_opcode (op, insn, 0, NULL);
+
+  if ((vd % nf) != 0)
+    *error = "illegal operands vd must be multiple of nf";
+  else if ((vs2 % nf) != 0)
+    *error = "illegal operands vs2 must be multiple of nf";
+  else
+    return match_opcode (op, insn, 0, NULL);
+  return 0;
+}
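Both nf-based checks reduce to an alignment rule on register numbers: a register group of nf registers must start at a multiple of nf. A minimal sketch (illustrative, not part of the patch), assuming nf is the decoded 3-bit field plus one as in the two functions above:

```python
# Illustrative sketch of the nf rules in match_vls_nf_rv / match_vmv_nf_rv.
def decode_nf(insn, shift):
    """nf is stored as nf - 1 in a 3-bit field
    (shift 29 for segment loads/stores, shift 15 for vmv<nf>r.v)."""
    return ((insn >> shift) & 0x7) + 1

def nf_reg_aligned(reg, nf):
    """A group of nf registers must start at a multiple of nf."""
    return reg % nf == 0

assert decode_nf(0x1 << 15, 15) == 2          # encoded 1 -> nf = 2
assert nf_reg_aligned(2, 2) and nf_reg_aligned(4, 2)  # vmv2r.v v2, v4: ok
assert not nf_reg_aligned(1, 2)               # vmv2r.v v1, v4: vd rejected
```

match_vls_nf_rv applies the rule to vd only; match_vmv_nf_rv applies it to both vd and vs2, since whole-register moves read and write full register groups.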
+
+/* Draft extensions.  */
+const struct riscv_opcode riscv_draft_opcodes[] =
+{
+/* name, xlen, isa, operands, match, mask, match_func, pinfo.  */
+/* RVV */
+{"vsetvl",     0, INSN_CLASS_V,  "d,s,t",  MATCH_VSETVL, MASK_VSETVL, match_opcode, 0},
+{"vsetvli",    0, INSN_CLASS_V,  "d,s,Vc", MATCH_VSETVLI, MASK_VSETVLI, match_opcode, 0},
+{"vsetivli",   0, INSN_CLASS_V,  "d,Z,Vb", MATCH_VSETIVLI, MASK_VSETIVLI, match_opcode, 0},
+
+{"vle1.v",     0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VLE1V, MASK_VLE1V, match_opcode, INSN_DREF },
+{"vse1.v",     0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VSE1V, MASK_VSE1V, match_opcode, INSN_DREF },
+
+{"vle8.v",     0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE8V, MASK_VLE8V, match_vd_neq_vm, INSN_DREF },
+{"vle16.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE16V, MASK_VLE16V, match_vd_neq_vm, INSN_DREF },
+{"vle32.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE32V, MASK_VLE32V, match_vd_neq_vm, INSN_DREF },
+{"vle64.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE64V, MASK_VLE64V, match_vd_neq_vm, INSN_DREF },
+
+{"vse8.v",     0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VSE8V, MASK_VSE8V, match_vd_neq_vm, INSN_DREF },
+{"vse16.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VSE16V, MASK_VSE16V, match_vd_neq_vm, INSN_DREF },
+{"vse32.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VSE32V, MASK_VSE32V, match_vd_neq_vm, INSN_DREF },
+{"vse64.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VSE64V, MASK_VSE64V, match_vd_neq_vm, INSN_DREF },
+
+{"vlse8.v",    0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VLSE8V, MASK_VLSE8V, match_vd_neq_vm, INSN_DREF },
+{"vlse16.v",   0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VLSE16V, MASK_VLSE16V, match_vd_neq_vm, INSN_DREF },
+{"vlse32.v",   0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VLSE32V, MASK_VLSE32V, match_vd_neq_vm, INSN_DREF },
+{"vlse64.v",   0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VLSE64V, MASK_VLSE64V, match_vd_neq_vm, INSN_DREF },
+
+{"vsse8.v",    0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VSSE8V, MASK_VSSE8V, match_vd_neq_vm, INSN_DREF },
+{"vsse16.v",   0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VSSE16V, MASK_VSSE16V, match_vd_neq_vm, INSN_DREF },
+{"vsse32.v",   0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VSSE32V, MASK_VSSE32V, match_vd_neq_vm, INSN_DREF },
+{"vsse64.v",   0, INSN_CLASS_V,  "Vd,0(s),tVm", MATCH_VSSE64V, MASK_VSSE64V, match_vd_neq_vm, INSN_DREF },
+
+{"vloxei8.v",   0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLOXEI8V, MASK_VLOXEI8V, match_vd_neq_vm, INSN_DREF },
+{"vloxei16.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLOXEI16V, MASK_VLOXEI16V, match_vd_neq_vm, INSN_DREF },
+{"vloxei32.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLOXEI32V, MASK_VLOXEI32V, match_vd_neq_vm, INSN_DREF },
+{"vloxei64.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLOXEI64V, MASK_VLOXEI64V, match_vd_neq_vm, INSN_DREF },
+
+{"vsoxei8.v",   0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSOXEI8V, MASK_VSOXEI8V, match_vd_neq_vm, INSN_DREF },
+{"vsoxei16.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSOXEI16V, MASK_VSOXEI16V, match_vd_neq_vm, INSN_DREF },
+{"vsoxei32.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSOXEI32V, MASK_VSOXEI32V, match_vd_neq_vm, INSN_DREF },
+{"vsoxei64.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSOXEI64V, MASK_VSOXEI64V, match_vd_neq_vm, INSN_DREF },
+
+{"vluxei8.v",   0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLUXEI8V, MASK_VLUXEI8V, match_vd_neq_vm, INSN_DREF },
+{"vluxei16.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLUXEI16V, MASK_VLUXEI16V, match_vd_neq_vm, INSN_DREF },
+{"vluxei32.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLUXEI32V, MASK_VLUXEI32V, match_vd_neq_vm, INSN_DREF },
+{"vluxei64.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VLUXEI64V, MASK_VLUXEI64V, match_vd_neq_vm, INSN_DREF },
+
+{"vsuxei8.v",   0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSUXEI8V, MASK_VSUXEI8V, match_vd_neq_vm, INSN_DREF },
+{"vsuxei16.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSUXEI16V, MASK_VSUXEI16V, match_vd_neq_vm, INSN_DREF },
+{"vsuxei32.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSUXEI32V, MASK_VSUXEI32V, match_vd_neq_vm, INSN_DREF },
+{"vsuxei64.v",  0, INSN_CLASS_V,  "Vd,0(s),VtVm", MATCH_VSUXEI64V, MASK_VSUXEI64V, match_vd_neq_vm, INSN_DREF },
+
+{"vle8ff.v",    0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE8FFV, MASK_VLE8FFV, match_vd_neq_vm, INSN_DREF },
+{"vle16ff.v",   0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE16FFV, MASK_VLE16FFV, match_vd_neq_vm, INSN_DREF },
+{"vle32ff.v",   0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE32FFV, MASK_VLE32FFV, match_vd_neq_vm, INSN_DREF },
+{"vle64ff.v",   0, INSN_CLASS_V,  "Vd,0(s)Vm", MATCH_VLE64FFV, MASK_VLE64FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E8V, MASK_VLSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG2E8V, MASK_VSSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E8V, MASK_VLSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG3E8V, MASK_VSSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E8V, MASK_VLSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG4E8V, MASK_VSSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E8V, MASK_VLSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG5E8V, MASK_VSSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E8V, MASK_VLSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG6E8V, MASK_VSSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E8V, MASK_VLSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG7E8V, MASK_VSSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E8V, MASK_VLSEG8E8V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG8E8V, MASK_VSSEG8E8V, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E16V, MASK_VLSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG2E16V, MASK_VSSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E16V, MASK_VLSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG3E16V, MASK_VSSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E16V, MASK_VLSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG4E16V, MASK_VSSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E16V, MASK_VLSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG5E16V, MASK_VSSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E16V, MASK_VLSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG6E16V, MASK_VSSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E16V, MASK_VLSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG7E16V, MASK_VSSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E16V, MASK_VLSEG8E16V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG8E16V, MASK_VSSEG8E16V, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E32V, MASK_VLSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG2E32V, MASK_VSSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E32V, MASK_VLSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG3E32V, MASK_VSSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E32V, MASK_VLSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG4E32V, MASK_VSSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E32V, MASK_VLSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG5E32V, MASK_VSSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E32V, MASK_VLSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG6E32V, MASK_VSSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E32V, MASK_VLSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG7E32V, MASK_VSSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E32V, MASK_VLSEG8E32V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG8E32V, MASK_VSSEG8E32V, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E64V, MASK_VLSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg2e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG2E64V, MASK_VSSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E64V, MASK_VLSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg3e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG3E64V, MASK_VSSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E64V, MASK_VLSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg4e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG4E64V, MASK_VSSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E64V, MASK_VLSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg5e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG5E64V, MASK_VSSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E64V, MASK_VLSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg6e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG6E64V, MASK_VSSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E64V, MASK_VLSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg7e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG7E64V, MASK_VSSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E64V, MASK_VLSEG8E64V, match_vd_neq_vm, INSN_DREF },
+{"vsseg8e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VSSEG8E64V, MASK_VSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG2E8V, MASK_VLSSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG2E8V, MASK_VSSSEG2E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG3E8V, MASK_VLSSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG3E8V, MASK_VSSSEG3E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG4E8V, MASK_VLSSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG4E8V, MASK_VSSSEG4E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG5E8V, MASK_VLSSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG5E8V, MASK_VSSSEG5E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG6E8V, MASK_VLSSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG6E8V, MASK_VSSSEG6E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG7E8V, MASK_VLSSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG7E8V, MASK_VSSSEG7E8V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG8E8V, MASK_VLSSEG8E8V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG8E8V, MASK_VSSSEG8E8V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG2E16V, MASK_VLSSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG2E16V, MASK_VSSSEG2E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG3E16V, MASK_VLSSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG3E16V, MASK_VSSSEG3E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG4E16V, MASK_VLSSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG4E16V, MASK_VSSSEG4E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG5E16V, MASK_VLSSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG5E16V, MASK_VSSSEG5E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG6E16V, MASK_VLSSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG6E16V, MASK_VSSSEG6E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG7E16V, MASK_VLSSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG7E16V, MASK_VSSSEG7E16V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG8E16V, MASK_VLSSEG8E16V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG8E16V, MASK_VSSSEG8E16V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG2E32V, MASK_VLSSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG2E32V, MASK_VSSSEG2E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG3E32V, MASK_VLSSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG3E32V, MASK_VSSSEG3E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG4E32V, MASK_VLSSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG4E32V, MASK_VSSSEG4E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG5E32V, MASK_VLSSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG5E32V, MASK_VSSSEG5E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG6E32V, MASK_VLSSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG6E32V, MASK_VSSSEG6E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG7E32V, MASK_VLSSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG7E32V, MASK_VSSSEG7E32V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG8E32V, MASK_VLSSEG8E32V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG8E32V, MASK_VSSSEG8E32V, match_vd_neq_vm, INSN_DREF },
+
+{"vlsseg2e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG2E64V, MASK_VLSSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg2e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG2E64V, MASK_VSSSEG2E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg3e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG3E64V, MASK_VLSSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg3e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG3E64V, MASK_VSSSEG3E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg4e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG4E64V, MASK_VLSSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg4e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG4E64V, MASK_VSSSEG4E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg5e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG5E64V, MASK_VLSSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg5e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG5E64V, MASK_VSSSEG5E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg6e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG6E64V, MASK_VLSSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg6e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG6E64V, MASK_VSSSEG6E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg7e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG7E64V, MASK_VLSSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg7e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG7E64V, MASK_VSSSEG7E64V, match_vd_neq_vm, INSN_DREF },
+{"vlsseg8e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VLSSEG8E64V, MASK_VLSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+{"vssseg8e64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),tVm", MATCH_VSSSEG8E64V, MASK_VSSSEG8E64V, match_vd_neq_vm, INSN_DREF },
+
+{"vloxseg2ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG2EI8V, MASK_VLOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG2EI8V, MASK_VSOXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG3EI8V, MASK_VLOXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG3EI8V, MASK_VSOXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG4EI8V, MASK_VLOXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG4EI8V, MASK_VSOXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG5EI8V, MASK_VLOXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG5EI8V, MASK_VSOXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG6EI8V, MASK_VLOXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG6EI8V, MASK_VSOXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG7EI8V, MASK_VLOXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG7EI8V, MASK_VSOXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG8EI8V, MASK_VLOXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG8EI8V, MASK_VSOXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vloxseg2ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG2EI16V, MASK_VLOXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG2EI16V, MASK_VSOXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG3EI16V, MASK_VLOXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG3EI16V, MASK_VSOXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG4EI16V, MASK_VLOXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG4EI16V, MASK_VSOXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG5EI16V, MASK_VLOXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG5EI16V, MASK_VSOXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG6EI16V, MASK_VLOXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG6EI16V, MASK_VSOXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG7EI16V, MASK_VLOXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG7EI16V, MASK_VSOXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG8EI16V, MASK_VLOXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG8EI16V, MASK_VSOXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vloxseg2ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG2EI32V, MASK_VLOXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG2EI32V, MASK_VSOXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG3EI32V, MASK_VLOXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG3EI32V, MASK_VSOXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG4EI32V, MASK_VLOXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG4EI32V, MASK_VSOXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG5EI32V, MASK_VLOXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG5EI32V, MASK_VSOXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG6EI32V, MASK_VLOXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG6EI32V, MASK_VSOXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG7EI32V, MASK_VLOXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG7EI32V, MASK_VSOXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG8EI32V, MASK_VLOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG8EI32V, MASK_VSOXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vloxseg2ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG2EI64V, MASK_VLOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg2ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG2EI64V, MASK_VSOXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg3ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG3EI64V, MASK_VLOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg3ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG3EI64V, MASK_VSOXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg4ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG4EI64V, MASK_VLOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg4ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG4EI64V, MASK_VSOXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg5ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG5EI64V, MASK_VLOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg5ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG5EI64V, MASK_VSOXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg6ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG6EI64V, MASK_VLOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg6ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG6EI64V, MASK_VSOXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg7ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG7EI64V, MASK_VLOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg7ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG7EI64V, MASK_VSOXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vloxseg8ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLOXSEG8EI64V, MASK_VLOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsoxseg8ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSOXSEG8EI64V, MASK_VSOXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG2EI8V, MASK_VLUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG2EI8V, MASK_VSUXSEG2EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG3EI8V, MASK_VLUXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG3EI8V, MASK_VSUXSEG3EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG4EI8V, MASK_VLUXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG4EI8V, MASK_VSUXSEG4EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG5EI8V, MASK_VLUXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG5EI8V, MASK_VSUXSEG5EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG6EI8V, MASK_VLUXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG6EI8V, MASK_VSUXSEG6EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG7EI8V, MASK_VLUXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG7EI8V, MASK_VSUXSEG7EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG8EI8V, MASK_VLUXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei8.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG8EI8V, MASK_VSUXSEG8EI8V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG2EI16V, MASK_VLUXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG2EI16V, MASK_VSUXSEG2EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG3EI16V, MASK_VLUXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG3EI16V, MASK_VSUXSEG3EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG4EI16V, MASK_VLUXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG4EI16V, MASK_VSUXSEG4EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG5EI16V, MASK_VLUXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG5EI16V, MASK_VSUXSEG5EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG6EI16V, MASK_VLUXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG6EI16V, MASK_VSUXSEG6EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG7EI16V, MASK_VLUXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG7EI16V, MASK_VSUXSEG7EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG8EI16V, MASK_VLUXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei16.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG8EI16V, MASK_VSUXSEG8EI16V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG2EI32V, MASK_VLUXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG2EI32V, MASK_VSUXSEG2EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG3EI32V, MASK_VLUXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG3EI32V, MASK_VSUXSEG3EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG4EI32V, MASK_VLUXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG4EI32V, MASK_VSUXSEG4EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG5EI32V, MASK_VLUXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG5EI32V, MASK_VSUXSEG5EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG6EI32V, MASK_VLUXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG6EI32V, MASK_VSUXSEG6EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG7EI32V, MASK_VLUXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG7EI32V, MASK_VSUXSEG7EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG8EI32V, MASK_VLUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei32.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG8EI32V, MASK_VSUXSEG8EI32V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vluxseg2ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG2EI64V, MASK_VLUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg2ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG2EI64V, MASK_VSUXSEG2EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg3ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG3EI64V, MASK_VLUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg3ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG3EI64V, MASK_VSUXSEG3EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg4ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG4EI64V, MASK_VLUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg4ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG4EI64V, MASK_VSUXSEG4EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg5ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG5EI64V, MASK_VLUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg5ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG5EI64V, MASK_VSUXSEG5EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg6ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG6EI64V, MASK_VLUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg6ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG6EI64V, MASK_VSUXSEG6EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg7ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG7EI64V, MASK_VLUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg7ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG7EI64V, MASK_VSUXSEG7EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vluxseg8ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VLUXSEG8EI64V, MASK_VLUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+{"vsuxseg8ei64.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s),VtVm", MATCH_VSUXSEG8EI64V, MASK_VSUXSEG8EI64V, match_vd_neq_vs2_neq_vm, INSN_DREF },
+
+{"vlseg2e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E8FFV, MASK_VLSEG2E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E8FFV, MASK_VLSEG3E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E8FFV, MASK_VLSEG4E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E8FFV, MASK_VLSEG5E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E8FFV, MASK_VLSEG6E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E8FFV, MASK_VLSEG7E8FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e8ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E8FFV, MASK_VLSEG8E8FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E16FFV, MASK_VLSEG2E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E16FFV, MASK_VLSEG3E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E16FFV, MASK_VLSEG4E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E16FFV, MASK_VLSEG5E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E16FFV, MASK_VLSEG6E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E16FFV, MASK_VLSEG7E16FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e16ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E16FFV, MASK_VLSEG8E16FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E32FFV, MASK_VLSEG2E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E32FFV, MASK_VLSEG3E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E32FFV, MASK_VLSEG4E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E32FFV, MASK_VLSEG5E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E32FFV, MASK_VLSEG6E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E32FFV, MASK_VLSEG7E32FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e32ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E32FFV, MASK_VLSEG8E32FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vlseg2e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG2E64FFV, MASK_VLSEG2E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg3e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG3E64FFV, MASK_VLSEG3E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg4e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG4E64FFV, MASK_VLSEG4E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg5e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG5E64FFV, MASK_VLSEG5E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg6e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG6E64FFV, MASK_VLSEG6E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg7e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG7E64FFV, MASK_VLSEG7E64FFV, match_vd_neq_vm, INSN_DREF },
+{"vlseg8e64ff.v",  0, INSN_CLASS_V_OR_ZVLSSEG,  "Vd,0(s)Vm", MATCH_VLSEG8E64FFV, MASK_VLSEG8E64FFV, match_vd_neq_vm, INSN_DREF },
+
+{"vl1r.v",      0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL1RE8V, MASK_VL1RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
+{"vl1re8.v",    0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL1RE8V, MASK_VL1RE8V, match_vls_nf_rv, INSN_DREF },
+{"vl1re16.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL1RE16V, MASK_VL1RE16V, match_vls_nf_rv, INSN_DREF },
+{"vl1re32.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL1RE32V, MASK_VL1RE32V, match_vls_nf_rv, INSN_DREF },
+{"vl1re64.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL1RE64V, MASK_VL1RE64V, match_vls_nf_rv, INSN_DREF },
+
+{"vl2r.v",      0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL2RE8V, MASK_VL2RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
+{"vl2re8.v",    0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL2RE8V, MASK_VL2RE8V, match_vls_nf_rv, INSN_DREF },
+{"vl2re16.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL2RE16V, MASK_VL2RE16V, match_vls_nf_rv, INSN_DREF },
+{"vl2re32.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL2RE32V, MASK_VL2RE32V, match_vls_nf_rv, INSN_DREF },
+{"vl2re64.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL2RE64V, MASK_VL2RE64V, match_vls_nf_rv, INSN_DREF },
+
+{"vl4r.v",      0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL4RE8V, MASK_VL4RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
+{"vl4re8.v",    0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL4RE8V, MASK_VL4RE8V, match_vls_nf_rv, INSN_DREF },
+{"vl4re16.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL4RE16V, MASK_VL4RE16V, match_vls_nf_rv, INSN_DREF },
+{"vl4re32.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL4RE32V, MASK_VL4RE32V, match_vls_nf_rv, INSN_DREF },
+{"vl4re64.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL4RE64V, MASK_VL4RE64V, match_vls_nf_rv, INSN_DREF },
+
+{"vl8r.v",      0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL8RE8V, MASK_VL8RE8V, match_vls_nf_rv, INSN_DREF|INSN_ALIAS },
+{"vl8re8.v",    0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL8RE8V, MASK_VL8RE8V, match_vls_nf_rv, INSN_DREF },
+{"vl8re16.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL8RE16V, MASK_VL8RE16V, match_vls_nf_rv, INSN_DREF },
+{"vl8re32.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL8RE32V, MASK_VL8RE32V, match_vls_nf_rv, INSN_DREF },
+{"vl8re64.v",   0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VL8RE64V, MASK_VL8RE64V, match_vls_nf_rv, INSN_DREF },
+
+{"vs1r.v",  0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VS1RV, MASK_VS1RV, match_vls_nf_rv, INSN_DREF },
+{"vs2r.v",  0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VS2RV, MASK_VS2RV, match_vls_nf_rv, INSN_DREF },
+{"vs4r.v",  0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VS4RV, MASK_VS4RV, match_vls_nf_rv, INSN_DREF },
+{"vs8r.v",  0, INSN_CLASS_V,  "Vd,0(s)", MATCH_VS8RV, MASK_VS8RV, match_vls_nf_rv, INSN_DREF },
+
+{"vamoaddei8.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI8V, MASK_VAMOADDEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei8.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI8V, MASK_VAMOSWAPEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei8.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI8V, MASK_VAMOXOREI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei8.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI8V, MASK_VAMOANDEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei8.v",    0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI8V, MASK_VAMOOREI8V, match_vd_neq_vm, INSN_DREF},
+{"vamominei8.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI8V, MASK_VAMOMINEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei8.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI8V, MASK_VAMOMAXEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei8.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI8V, MASK_VAMOMINUEI8V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei8.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI8V, MASK_VAMOMAXUEI8V, match_vd_neq_vm, INSN_DREF},
+
+{"vamoaddei16.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI16V, MASK_VAMOADDEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei16.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI16V, MASK_VAMOSWAPEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei16.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI16V, MASK_VAMOXOREI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei16.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI16V, MASK_VAMOANDEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei16.v",    0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI16V, MASK_VAMOOREI16V, match_vd_neq_vm, INSN_DREF},
+{"vamominei16.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI16V, MASK_VAMOMINEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei16.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI16V, MASK_VAMOMAXEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei16.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI16V, MASK_VAMOMINUEI16V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei16.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI16V, MASK_VAMOMAXUEI16V, match_vd_neq_vm, INSN_DREF},
+
+{"vamoaddei32.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI32V, MASK_VAMOADDEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei32.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI32V, MASK_VAMOSWAPEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei32.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI32V, MASK_VAMOXOREI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei32.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI32V, MASK_VAMOANDEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei32.v",    0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI32V, MASK_VAMOOREI32V, match_vd_neq_vm, INSN_DREF},
+{"vamominei32.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI32V, MASK_VAMOMINEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei32.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI32V, MASK_VAMOMAXEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei32.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI32V, MASK_VAMOMINUEI32V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei32.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI32V, MASK_VAMOMAXUEI32V, match_vd_neq_vm, INSN_DREF},
+
+{"vamoaddei64.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOADDEI64V, MASK_VAMOADDEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoswapei64.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOSWAPEI64V, MASK_VAMOSWAPEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoxorei64.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOXOREI64V, MASK_VAMOXOREI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoandei64.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOANDEI64V, MASK_VAMOANDEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamoorei64.v",    0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOOREI64V, MASK_VAMOOREI64V, match_vd_neq_vm, INSN_DREF},
+{"vamominei64.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINEI64V, MASK_VAMOMINEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxei64.v",   0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXEI64V, MASK_VAMOMAXEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamominuei64.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMINUEI64V, MASK_VAMOMINUEI64V, match_vd_neq_vm, INSN_DREF},
+{"vamomaxuei64.v",  0, INSN_CLASS_V_OR_ZVAMO,  "Ve,0(s),Vt,VfVm", MATCH_VAMOMAXUEI64V, MASK_VAMOMAXUEI64V, match_vd_neq_vm, INSN_DREF},
+
+{"vneg.v",     0, INSN_CLASS_V,  "Vd,VtVm",  MATCH_VRSUBVX, MASK_VRSUBVX | MASK_RS1, match_vd_neq_vm, INSN_ALIAS },
+
+{"vadd.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VADDVV, MASK_VADDVV, match_vd_neq_vm, 0 },
+{"vadd.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VADDVX, MASK_VADDVX, match_vd_neq_vm, 0 },
+{"vadd.vi",    0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VADDVI, MASK_VADDVI, match_vd_neq_vm, 0 },
+{"vsub.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSUBVV, MASK_VSUBVV, match_vd_neq_vm, 0 },
+{"vsub.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSUBVX, MASK_VSUBVX, match_vd_neq_vm, 0 },
+{"vrsub.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VRSUBVX, MASK_VRSUBVX, match_vd_neq_vm, 0 },
+{"vrsub.vi",   0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VRSUBVI, MASK_VRSUBVI, match_vd_neq_vm, 0 },
+
+{"vwcvt.x.x.v",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VWCVTXXV, MASK_VWCVTXXV, match_widen_vd_neq_vs2_neq_vm, INSN_ALIAS },
+{"vwcvtu.x.x.v", 0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VWCVTUXXV, MASK_VWCVTUXXV, match_widen_vd_neq_vs2_neq_vm, INSN_ALIAS },
+
+{"vwaddu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWADDUVV, MASK_VWADDUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwaddu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWADDUVX, MASK_VWADDUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwsubu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWSUBUVV, MASK_VWSUBUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwsubu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWSUBUVX, MASK_VWSUBUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwadd.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWADDVV, MASK_VWADDVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwadd.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWADDVX, MASK_VWADDVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwsub.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWSUBVV, MASK_VWSUBVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwsub.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWSUBVX, MASK_VWSUBVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwaddu.wv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWADDUWV, MASK_VWADDUWV, match_widen_vd_neq_vs1_neq_vm, 0 },
+{"vwaddu.wx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWADDUWX, MASK_VWADDUWX, match_widen_vd_neq_vm, 0 },
+{"vwsubu.wv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWSUBUWV, MASK_VWSUBUWV, match_widen_vd_neq_vs1_neq_vm, 0 },
+{"vwsubu.wx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWSUBUWX, MASK_VWSUBUWX, match_widen_vd_neq_vm, 0 },
+{"vwadd.wv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWADDWV, MASK_VWADDWV, match_widen_vd_neq_vs1_neq_vm, 0 },
+{"vwadd.wx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWADDWX, MASK_VWADDWX, match_widen_vd_neq_vm, 0 },
+{"vwsub.wv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWSUBWV, MASK_VWSUBWV, match_widen_vd_neq_vs1_neq_vm, 0 },
+{"vwsub.wx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWSUBWX, MASK_VWSUBWX, match_widen_vd_neq_vm, 0 },
+
+{"vzext.vf2",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VZEXT_VF2, MASK_VZEXT_VF2, match_vd_neq_vm, 0 },
+{"vsext.vf2",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VSEXT_VF2, MASK_VSEXT_VF2, match_vd_neq_vm, 0 },
+{"vzext.vf4",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VZEXT_VF4, MASK_VZEXT_VF4, match_vd_neq_vm, 0 },
+{"vsext.vf4",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VSEXT_VF4, MASK_VSEXT_VF4, match_vd_neq_vm, 0 },
+{"vzext.vf8",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VZEXT_VF8, MASK_VZEXT_VF8, match_vd_neq_vm, 0 },
+{"vsext.vf8",  0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VSEXT_VF8, MASK_VSEXT_VF8, match_vd_neq_vm, 0 },
+
+{"vadc.vvm",   0, INSN_CLASS_V,  "Vd,Vt,Vs,V0", MATCH_VADCVVM, MASK_VADCVVM, match_vd_neq_vm, 0 },
+{"vadc.vxm",   0, INSN_CLASS_V,  "Vd,Vt,s,V0", MATCH_VADCVXM, MASK_VADCVXM, match_vd_neq_vm, 0 },
+{"vadc.vim",   0, INSN_CLASS_V,  "Vd,Vt,Vi,V0", MATCH_VADCVIM, MASK_VADCVIM, match_vd_neq_vm, 0 },
+{"vmadc.vvm",  0, INSN_CLASS_V,  "Vd,Vt,Vs,V0", MATCH_VMADCVVM, MASK_VMADCVVM, match_opcode, 0 },
+{"vmadc.vxm",  0, INSN_CLASS_V,  "Vd,Vt,s,V0", MATCH_VMADCVXM, MASK_VMADCVXM, match_opcode, 0 },
+{"vmadc.vim",  0, INSN_CLASS_V,  "Vd,Vt,Vi,V0", MATCH_VMADCVIM, MASK_VMADCVIM, match_opcode, 0 },
+{"vmadc.vv",   0, INSN_CLASS_V,  "Vd,Vt,Vs", MATCH_VMADCVV, MASK_VMADCVV, match_opcode, 0 },
+{"vmadc.vx",   0, INSN_CLASS_V,  "Vd,Vt,s", MATCH_VMADCVX, MASK_VMADCVX, match_opcode, 0 },
+{"vmadc.vi",   0, INSN_CLASS_V,  "Vd,Vt,Vi", MATCH_VMADCVI, MASK_VMADCVI, match_opcode, 0 },
+{"vsbc.vvm",   0, INSN_CLASS_V,  "Vd,Vt,Vs,V0", MATCH_VSBCVVM, MASK_VSBCVVM, match_vd_neq_vm, 0 },
+{"vsbc.vxm",   0, INSN_CLASS_V,  "Vd,Vt,s,V0", MATCH_VSBCVXM, MASK_VSBCVXM, match_vd_neq_vm, 0 },
+{"vmsbc.vvm",  0, INSN_CLASS_V,  "Vd,Vt,Vs,V0", MATCH_VMSBCVVM, MASK_VMSBCVVM, match_opcode, 0 },
+{"vmsbc.vxm",  0, INSN_CLASS_V,  "Vd,Vt,s,V0", MATCH_VMSBCVXM, MASK_VMSBCVXM, match_opcode, 0 },
+{"vmsbc.vv",   0, INSN_CLASS_V,  "Vd,Vt,Vs", MATCH_VMSBCVV, MASK_VMSBCVV, match_opcode, 0 },
+{"vmsbc.vx",   0, INSN_CLASS_V,  "Vd,Vt,s", MATCH_VMSBCVX, MASK_VMSBCVX, match_opcode, 0 },
+
+{"vnot.v",     0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VNOTV, MASK_VNOTV, match_vd_neq_vm, INSN_ALIAS },
+
+{"vand.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VANDVV, MASK_VANDVV, match_vd_neq_vm, 0 },
+{"vand.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VANDVX, MASK_VANDVX, match_vd_neq_vm, 0 },
+{"vand.vi",    0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VANDVI, MASK_VANDVI, match_vd_neq_vm, 0 },
+{"vor.vv",     0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VORVV, MASK_VORVV, match_vd_neq_vm, 0 },
+{"vor.vx",     0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VORVX, MASK_VORVX, match_vd_neq_vm, 0 },
+{"vor.vi",     0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VORVI, MASK_VORVI, match_vd_neq_vm, 0 },
+{"vxor.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VXORVV, MASK_VXORVV, match_vd_neq_vm, 0 },
+{"vxor.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VXORVX, MASK_VXORVX, match_vd_neq_vm, 0 },
+{"vxor.vi",    0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VXORVI, MASK_VXORVI, match_vd_neq_vm, 0 },
+
+{"vsll.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSLLVV, MASK_VSLLVV, match_vd_neq_vm, 0 },
+{"vsll.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSLLVX, MASK_VSLLVX, match_vd_neq_vm, 0 },
+{"vsll.vi",    0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VSLLVI, MASK_VSLLVI, match_vd_neq_vm, 0 },
+{"vsrl.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSRLVV, MASK_VSRLVV, match_vd_neq_vm, 0 },
+{"vsrl.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSRLVX, MASK_VSRLVX, match_vd_neq_vm, 0 },
+{"vsrl.vi",    0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VSRLVI, MASK_VSRLVI, match_vd_neq_vm, 0 },
+{"vsra.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSRAVV, MASK_VSRAVV, match_vd_neq_vm, 0 },
+{"vsra.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSRAVX, MASK_VSRAVX, match_vd_neq_vm, 0 },
+{"vsra.vi",    0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VSRAVI, MASK_VSRAVI, match_vd_neq_vm, 0 },
+
+{"vncvt.x.x.w", 0, INSN_CLASS_V,  "Vd,VtVm", MATCH_VNCVTXXW, MASK_VNCVTXXW, match_narrow_vd_neq_vs2_neq_vm, INSN_ALIAS },
+
+{"vnsrl.wv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VNSRLWV, MASK_VNSRLWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnsrl.wx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VNSRLWX, MASK_VNSRLWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnsrl.wi",   0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VNSRLWI, MASK_VNSRLWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnsra.wv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VNSRAWV, MASK_VNSRAWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnsra.wx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VNSRAWX, MASK_VNSRAWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnsra.wi",   0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VNSRAWI, MASK_VNSRAWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
+
+{"vmseq.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMSEQVV, MASK_VMSEQVV, match_opcode, 0 },
+{"vmseq.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSEQVX, MASK_VMSEQVX, match_opcode, 0 },
+{"vmseq.vi",   0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VMSEQVI, MASK_VMSEQVI, match_opcode, 0 },
+{"vmsne.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMSNEVV, MASK_VMSNEVV, match_opcode, 0 },
+{"vmsne.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSNEVX, MASK_VMSNEVX, match_opcode, 0 },
+{"vmsne.vi",   0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VMSNEVI, MASK_VMSNEVI, match_opcode, 0 },
+{"vmsltu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMSLTUVV, MASK_VMSLTUVV, match_opcode, 0 },
+{"vmsltu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSLTUVX, MASK_VMSLTUVX, match_opcode, 0 },
+{"vmslt.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMSLTVV, MASK_VMSLTVV, match_opcode, 0 },
+{"vmslt.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSLTVX, MASK_VMSLTVX, match_opcode, 0 },
+{"vmsleu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMSLEUVV, MASK_VMSLEUVV, match_opcode, 0 },
+{"vmsleu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSLEUVX, MASK_VMSLEUVX, match_opcode, 0 },
+{"vmsleu.vi",  0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VMSLEUVI, MASK_VMSLEUVI, match_opcode, 0 },
+{"vmsle.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMSLEVV, MASK_VMSLEVV, match_opcode, 0 },
+{"vmsle.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSLEVX, MASK_VMSLEVX, match_opcode, 0 },
+{"vmsle.vi",   0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VMSLEVI, MASK_VMSLEVI, match_opcode, 0 },
+{"vmsgtu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSGTUVX, MASK_VMSGTUVX, match_opcode, 0 },
+{"vmsgtu.vi",  0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VMSGTUVI, MASK_VMSGTUVI, match_opcode, 0 },
+{"vmsgt.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMSGTVX, MASK_VMSGTVX, match_opcode, 0 },
+{"vmsgt.vi",   0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VMSGTVI, MASK_VMSGTVI, match_opcode, 0 },
+
+/* These aliases are for assembly but not disassembly.  */
+{"vmsgt.vv",   0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VMSLTVV, MASK_VMSLTVV, match_opcode, INSN_ALIAS },
+{"vmsgtu.vv",  0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VMSLTUVV, MASK_VMSLTUVV, match_opcode, INSN_ALIAS },
+{"vmsge.vv",   0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VMSLEVV, MASK_VMSLEVV, match_opcode, INSN_ALIAS },
+{"vmsgeu.vv",  0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VMSLEUVV, MASK_VMSLEUVV, match_opcode, INSN_ALIAS },
+{"vmslt.vi",   0, INSN_CLASS_V,  "Vd,Vt,VkVm", MATCH_VMSLEVI, MASK_VMSLEVI, match_opcode, INSN_ALIAS },
+{"vmsltu.vi",  0, INSN_CLASS_V,  "Vd,Vu,0Vm", MATCH_VMSNEVV, MASK_VMSNEVV, match_opcode, INSN_ALIAS },
+{"vmsltu.vi",  0, INSN_CLASS_V,  "Vd,Vt,VkVm", MATCH_VMSLEUVI, MASK_VMSLEUVI, match_opcode, INSN_ALIAS },
+{"vmsge.vi",   0, INSN_CLASS_V,  "Vd,Vt,VkVm", MATCH_VMSGTVI, MASK_VMSGTVI, match_opcode, INSN_ALIAS },
+{"vmsgeu.vi",  0, INSN_CLASS_V,  "Vd,Vu,0Vm", MATCH_VMSEQVV, MASK_VMSEQVV, match_opcode, INSN_ALIAS },
+{"vmsgeu.vi",  0, INSN_CLASS_V,  "Vd,Vt,VkVm", MATCH_VMSGTUVI, MASK_VMSGTUVI, match_opcode, INSN_ALIAS },
+
+{"vmsge.vx",   0, INSN_CLASS_V, "Vd,Vt,sVm", 0, (int) M_VMSGE, match_never, INSN_MACRO },
+{"vmsge.vx",   0, INSN_CLASS_V, "Vd,Vt,s,VM,VT", 0, (int) M_VMSGE, match_never, INSN_MACRO },
+{"vmsgeu.vx",  0, INSN_CLASS_V, "Vd,Vt,sVm", 0, (int) M_VMSGEU, match_never, INSN_MACRO },
+{"vmsgeu.vx",  0, INSN_CLASS_V, "Vd,Vt,s,VM,VT", 0, (int) M_VMSGEU, match_never, INSN_MACRO },
+
+{"vminu.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMINUVV, MASK_VMINUVV, match_vd_neq_vm, 0},
+{"vminu.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMINUVX, MASK_VMINUVX, match_vd_neq_vm, 0},
+{"vmin.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMINVV, MASK_VMINVV, match_vd_neq_vm, 0},
+{"vmin.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMINVX, MASK_VMINVX, match_vd_neq_vm, 0},
+{"vmaxu.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMAXUVV, MASK_VMAXUVV, match_vd_neq_vm, 0},
+{"vmaxu.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMAXUVX, MASK_VMAXUVX, match_vd_neq_vm, 0},
+{"vmax.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMAXVV, MASK_VMAXVV, match_vd_neq_vm, 0},
+{"vmax.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMAXVX, MASK_VMAXVX, match_vd_neq_vm, 0},
+
+{"vmul.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMULVV, MASK_VMULVV, match_vd_neq_vm, 0 },
+{"vmul.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMULVX, MASK_VMULVX, match_vd_neq_vm, 0 },
+{"vmulh.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMULHVV, MASK_VMULHVV, match_vd_neq_vm, 0 },
+{"vmulh.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMULHVX, MASK_VMULHVX, match_vd_neq_vm, 0 },
+{"vmulhu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMULHUVV, MASK_VMULHUVV, match_vd_neq_vm, 0 },
+{"vmulhu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMULHUVX, MASK_VMULHUVX, match_vd_neq_vm, 0 },
+{"vmulhsu.vv", 0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VMULHSUVV, MASK_VMULHSUVV, match_vd_neq_vm, 0 },
+{"vmulhsu.vx", 0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VMULHSUVX, MASK_VMULHSUVX, match_vd_neq_vm, 0 },
+
+{"vwmul.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWMULVV, MASK_VWMULVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwmul.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWMULVX, MASK_VWMULVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwmulu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWMULUVV, MASK_VWMULUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwmulu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWMULUVX, MASK_VWMULUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+{"vwmulsu.vv", 0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VWMULSUVV, MASK_VWMULSUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0 },
+{"vwmulsu.vx", 0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VWMULSUVX, MASK_VWMULSUVX, match_widen_vd_neq_vs2_neq_vm, 0 },
+
+{"vmacc.vv",   0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VMACCVV, MASK_VMACCVV, match_vd_neq_vm, 0},
+{"vmacc.vx",   0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VMACCVX, MASK_VMACCVX, match_vd_neq_vm, 0},
+{"vnmsac.vv",  0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VNMSACVV, MASK_VNMSACVV, match_vd_neq_vm, 0},
+{"vnmsac.vx",  0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VNMSACVX, MASK_VNMSACVX, match_vd_neq_vm, 0},
+{"vmadd.vv",   0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VMADDVV, MASK_VMADDVV, match_vd_neq_vm, 0},
+{"vmadd.vx",   0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VMADDVX, MASK_VMADDVX, match_vd_neq_vm, 0},
+{"vnmsub.vv",  0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VNMSUBVV, MASK_VNMSUBVV, match_vd_neq_vm, 0},
+{"vnmsub.vx",  0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VNMSUBVX, MASK_VNMSUBVX, match_vd_neq_vm, 0},
+
+{"vwmaccu.vv",  0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VWMACCUVV, MASK_VWMACCUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vwmaccu.vx",  0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VWMACCUVX, MASK_VWMACCUVX, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vwmacc.vv",   0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VWMACCVV, MASK_VWMACCVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vwmacc.vx",   0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VWMACCVX, MASK_VWMACCVX, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vwmaccsu.vv", 0, INSN_CLASS_V,  "Vd,Vs,VtVm", MATCH_VWMACCSUVV, MASK_VWMACCSUVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vwmaccsu.vx", 0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VWMACCSUVX, MASK_VWMACCSUVX, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vwmaccus.vx", 0, INSN_CLASS_V,  "Vd,s,VtVm", MATCH_VWMACCUSVX, MASK_VWMACCUSVX, match_widen_vd_neq_vs2_neq_vm, 0},
+
+{"vdivu.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VDIVUVV, MASK_VDIVUVV, match_vd_neq_vm, 0 },
+{"vdivu.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VDIVUVX, MASK_VDIVUVX, match_vd_neq_vm, 0 },
+{"vdiv.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VDIVVV, MASK_VDIVVV, match_vd_neq_vm, 0 },
+{"vdiv.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VDIVVX, MASK_VDIVVX, match_vd_neq_vm, 0 },
+{"vremu.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VREMUVV, MASK_VREMUVV, match_vd_neq_vm, 0 },
+{"vremu.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VREMUVX, MASK_VREMUVX, match_vd_neq_vm, 0 },
+{"vrem.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VREMVV, MASK_VREMVV, match_vd_neq_vm, 0 },
+{"vrem.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VREMVX, MASK_VREMVX, match_vd_neq_vm, 0 },
+
+{"vmerge.vvm", 0, INSN_CLASS_V,  "Vd,Vt,Vs,V0", MATCH_VMERGEVVM, MASK_VMERGEVVM, match_opcode, 0 },
+{"vmerge.vxm", 0, INSN_CLASS_V,  "Vd,Vt,s,V0", MATCH_VMERGEVXM, MASK_VMERGEVXM, match_opcode, 0 },
+{"vmerge.vim", 0, INSN_CLASS_V,  "Vd,Vt,Vi,V0", MATCH_VMERGEVIM, MASK_VMERGEVIM, match_opcode, 0 },
+
+{"vmv.v.v",    0, INSN_CLASS_V,  "Vd,Vs", MATCH_VMVVV, MASK_VMVVV, match_opcode, 0 },
+{"vmv.v.x",    0, INSN_CLASS_V,  "Vd,s", MATCH_VMVVX, MASK_VMVVX, match_opcode, 0 },
+{"vmv.v.i",    0, INSN_CLASS_V,  "Vd,Vi", MATCH_VMVVI, MASK_VMVVI, match_opcode, 0 },
+
+{"vsaddu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSADDUVV, MASK_VSADDUVV, match_vd_neq_vm, 0 },
+{"vsaddu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSADDUVX, MASK_VSADDUVX, match_vd_neq_vm, 0 },
+{"vsaddu.vi",  0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VSADDUVI, MASK_VSADDUVI, match_vd_neq_vm, 0 },
+{"vsadd.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSADDVV, MASK_VSADDVV, match_vd_neq_vm, 0 },
+{"vsadd.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSADDVX, MASK_VSADDVX, match_vd_neq_vm, 0 },
+{"vsadd.vi",   0, INSN_CLASS_V,  "Vd,Vt,ViVm", MATCH_VSADDVI, MASK_VSADDVI, match_vd_neq_vm, 0 },
+{"vssubu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSSUBUVV, MASK_VSSUBUVV, match_vd_neq_vm, 0 },
+{"vssubu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSSUBUVX, MASK_VSSUBUVX, match_vd_neq_vm, 0 },
+{"vssub.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSSUBVV, MASK_VSSUBVV, match_vd_neq_vm, 0 },
+{"vssub.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSSUBVX, MASK_VSSUBVX, match_vd_neq_vm, 0 },
+
+{"vaaddu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VAADDUVV, MASK_VAADDUVV, match_vd_neq_vm, 0 },
+{"vaaddu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VAADDUVX, MASK_VAADDUVX, match_vd_neq_vm, 0 },
+{"vaadd.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VAADDVV, MASK_VAADDVV, match_vd_neq_vm, 0 },
+{"vaadd.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VAADDVX, MASK_VAADDVX, match_vd_neq_vm, 0 },
+{"vasubu.vv",  0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VASUBUVV, MASK_VASUBUVV, match_vd_neq_vm, 0 },
+{"vasubu.vx",  0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VASUBUVX, MASK_VASUBUVX, match_vd_neq_vm, 0 },
+{"vasub.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VASUBVV, MASK_VASUBVV, match_vd_neq_vm, 0 },
+{"vasub.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VASUBVX, MASK_VASUBVX, match_vd_neq_vm, 0 },
+
+{"vsmul.vv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSMULVV, MASK_VSMULVV, match_vd_neq_vm, 0 },
+{"vsmul.vx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSMULVX, MASK_VSMULVX, match_vd_neq_vm, 0 },
+
+{"vssrl.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSSRLVV, MASK_VSSRLVV, match_vd_neq_vm, 0 },
+{"vssrl.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSSRLVX, MASK_VSSRLVX, match_vd_neq_vm, 0 },
+{"vssrl.vi",    0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VSSRLVI, MASK_VSSRLVI, match_vd_neq_vm, 0 },
+{"vssra.vv",    0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VSSRAVV, MASK_VSSRAVV, match_vd_neq_vm, 0 },
+{"vssra.vx",    0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VSSRAVX, MASK_VSSRAVX, match_vd_neq_vm, 0 },
+{"vssra.vi",    0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VSSRAVI, MASK_VSSRAVI, match_vd_neq_vm, 0 },
+
+{"vnclipu.wv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VNCLIPUWV, MASK_VNCLIPUWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnclipu.wx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VNCLIPUWX, MASK_VNCLIPUWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnclipu.wi",   0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VNCLIPUWI, MASK_VNCLIPUWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnclip.wv",   0, INSN_CLASS_V,  "Vd,Vt,VsVm", MATCH_VNCLIPWV, MASK_VNCLIPWV, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnclip.wx",   0, INSN_CLASS_V,  "Vd,Vt,sVm", MATCH_VNCLIPWX, MASK_VNCLIPWX, match_narrow_vd_neq_vs2_neq_vm, 0 },
+{"vnclip.wi",   0, INSN_CLASS_V,  "Vd,Vt,VjVm", MATCH_VNCLIPWI, MASK_VNCLIPWI, match_narrow_vd_neq_vs2_neq_vm, 0 },
+
+{"vfadd.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFADDVV, MASK_VFADDVV, match_vd_neq_vm, 0},
+{"vfadd.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFADDVF, MASK_VFADDVF, match_vd_neq_vm, 0},
+{"vfsub.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFSUBVV, MASK_VFSUBVV, match_vd_neq_vm, 0},
+{"vfsub.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSUBVF, MASK_VFSUBVF, match_vd_neq_vm, 0},
+{"vfrsub.vf",  0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFRSUBVF, MASK_VFRSUBVF, match_vd_neq_vm, 0},
+
+{"vfwadd.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWADDVV, MASK_VFWADDVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwadd.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWADDVF, MASK_VFWADDVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwsub.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWSUBVV, MASK_VFWSUBVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwsub.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWSUBVF, MASK_VFWSUBVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwadd.wv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWADDWV, MASK_VFWADDWV, match_widen_vd_neq_vs1_neq_vm, 0},
+{"vfwadd.wf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWADDWF, MASK_VFWADDWF, match_widen_vd_neq_vm, 0},
+{"vfwsub.wv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWSUBWV, MASK_VFWSUBWV, match_widen_vd_neq_vs1_neq_vm, 0},
+{"vfwsub.wf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWSUBWF, MASK_VFWSUBWF, match_widen_vd_neq_vm, 0},
+
+{"vfmul.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFMULVV, MASK_VFMULVV, match_vd_neq_vm, 0},
+{"vfmul.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFMULVF, MASK_VFMULVF, match_vd_neq_vm, 0},
+{"vfdiv.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFDIVVV, MASK_VFDIVVV, match_vd_neq_vm, 0},
+{"vfdiv.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFDIVVF, MASK_VFDIVVF, match_vd_neq_vm, 0},
+{"vfrdiv.vf",  0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFRDIVVF, MASK_VFRDIVVF, match_vd_neq_vm, 0},
+
+{"vfwmul.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWMULVV, MASK_VFWMULVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwmul.vf",  0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFWMULVF, MASK_VFWMULVF, match_widen_vd_neq_vs2_neq_vm, 0},
+
+{"vfmadd.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFMADDVV, MASK_VFMADDVV, match_vd_neq_vm, 0},
+{"vfmadd.vf",  0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFMADDVF, MASK_VFMADDVF, match_vd_neq_vm, 0},
+{"vfnmadd.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFNMADDVV, MASK_VFNMADDVV, match_vd_neq_vm, 0},
+{"vfnmadd.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFNMADDVF, MASK_VFNMADDVF, match_vd_neq_vm, 0},
+{"vfmsub.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFMSUBVV, MASK_VFMSUBVV, match_vd_neq_vm, 0},
+{"vfmsub.vf",  0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFMSUBVF, MASK_VFMSUBVF, match_vd_neq_vm, 0},
+{"vfnmsub.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFNMSUBVV, MASK_VFNMSUBVV, match_vd_neq_vm, 0},
+{"vfnmsub.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFNMSUBVF, MASK_VFNMSUBVF, match_vd_neq_vm, 0},
+{"vfmacc.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFMACCVV, MASK_VFMACCVV, match_vd_neq_vm, 0},
+{"vfmacc.vf",  0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFMACCVF, MASK_VFMACCVF, match_vd_neq_vm, 0},
+{"vfnmacc.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFNMACCVV, MASK_VFNMACCVV, match_vd_neq_vm, 0},
+{"vfnmacc.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFNMACCVF, MASK_VFNMACCVF, match_vd_neq_vm, 0},
+{"vfmsac.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFMSACVV, MASK_VFMSACVV, match_vd_neq_vm, 0},
+{"vfmsac.vf",  0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFMSACVF, MASK_VFMSACVF, match_vd_neq_vm, 0},
+{"vfnmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFNMSACVV, MASK_VFNMSACVV, match_vd_neq_vm, 0},
+{"vfnmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFNMSACVF, MASK_VFNMSACVF, match_vd_neq_vm, 0},
+
+{"vfwmacc.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWMACCVV, MASK_VFWMACCVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwmacc.vf",  0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWMACCVF, MASK_VFWMACCVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwnmacc.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWNMACCVV, MASK_VFWNMACCVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwnmacc.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWNMACCVF, MASK_VFWNMACCVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwmsac.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWMSACVV, MASK_VFWMSACVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwmsac.vf",  0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWMSACVF, MASK_VFWMSACVF, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwnmsac.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VFWNMSACVV, MASK_VFWNMSACVV, match_widen_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vfwnmsac.vf", 0, INSN_CLASS_V_AND_F, "Vd,S,VtVm", MATCH_VFWNMSACVF, MASK_VFWNMSACVF, match_widen_vd_neq_vs2_neq_vm, 0},
+
+{"vfsqrt.v",   0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFSQRTV, MASK_VFSQRTV, match_vd_neq_vm, 0},
+{"vfrsqrt7.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFRSQRT7V, MASK_VFRSQRT7V, match_vd_neq_vm, 0},
+{"vfrsqrte7.v",0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFRSQRT7V, MASK_VFRSQRT7V, match_vd_neq_vm, 0},
+{"vfrec7.v",   0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFREC7V, MASK_VFREC7V, match_vd_neq_vm, 0},
+{"vfrece7.v",  0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFREC7V, MASK_VFREC7V, match_vd_neq_vm, 0},
+{"vfclass.v",  0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCLASSV, MASK_VFCLASSV, match_vd_neq_vm, 0},
+
+{"vfmin.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFMINVV, MASK_VFMINVV, match_vd_neq_vm, 0},
+{"vfmin.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFMINVF, MASK_VFMINVF, match_vd_neq_vm, 0},
+{"vfmax.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFMAXVV, MASK_VFMAXVV, match_vd_neq_vm, 0},
+{"vfmax.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFMAXVF, MASK_VFMAXVF, match_vd_neq_vm, 0},
+
+{"vfneg.v",    0, INSN_CLASS_V_AND_F, "Vd,VuVm", MATCH_VFSGNJNVV, MASK_VFSGNJNVV, match_vs1_eq_vs2_neq_vm, INSN_ALIAS },
+
+{"vfsgnj.vv",  0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFSGNJVV, MASK_VFSGNJVV, match_vd_neq_vm, 0},
+{"vfsgnj.vf",  0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSGNJVF, MASK_VFSGNJVF, match_vd_neq_vm, 0},
+{"vfsgnjn.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFSGNJNVV, MASK_VFSGNJNVV, match_vd_neq_vm, 0},
+{"vfsgnjn.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSGNJNVF, MASK_VFSGNJNVF, match_vd_neq_vm, 0},
+{"vfsgnjx.vv", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFSGNJXVV, MASK_VFSGNJXVV, match_vd_neq_vm, 0},
+{"vfsgnjx.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSGNJXVF, MASK_VFSGNJXVF, match_vd_neq_vm, 0},
+
+{"vmfeq.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VMFEQVV, MASK_VMFEQVV, match_opcode, 0},
+{"vmfeq.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VMFEQVF, MASK_VMFEQVF, match_opcode, 0},
+{"vmfne.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VMFNEVV, MASK_VMFNEVV, match_opcode, 0},
+{"vmfne.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VMFNEVF, MASK_VMFNEVF, match_opcode, 0},
+{"vmflt.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VMFLTVV, MASK_VMFLTVV, match_opcode, 0},
+{"vmflt.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VMFLTVF, MASK_VMFLTVF, match_opcode, 0},
+{"vmfle.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VMFLEVV, MASK_VMFLEVV, match_opcode, 0},
+{"vmfle.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VMFLEVF, MASK_VMFLEVF, match_opcode, 0},
+{"vmfgt.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VMFGTVF, MASK_VMFGTVF, match_opcode, 0},
+{"vmfge.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VMFGEVF, MASK_VMFGEVF, match_opcode, 0},
+
+/* These aliases are for assembly but not disassembly.  */
+{"vmfgt.vv",    0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VMFLTVV, MASK_VMFLTVV, match_opcode, INSN_ALIAS},
+{"vmfge.vv",   0, INSN_CLASS_V_AND_F, "Vd,Vs,VtVm", MATCH_VMFLEVV, MASK_VMFLEVV, match_opcode, INSN_ALIAS},
+
+{"vfmerge.vfm",0, INSN_CLASS_V_AND_F, "Vd,Vt,S,V0", MATCH_VFMERGEVFM, MASK_VFMERGEVFM, match_opcode, 0},
+{"vfmv.v.f",   0, INSN_CLASS_V_AND_F, "Vd,S", MATCH_VFMVVF, MASK_VFMVVF, match_opcode, 0 },
+
+{"vfcvt.xu.f.v",     0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTXUFV, MASK_VFCVTXUFV, match_vd_neq_vm, 0},
+{"vfcvt.x.f.v",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTXFV, MASK_VFCVTXFV, match_vd_neq_vm, 0},
+{"vfcvt.rtz.xu.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTRTZXUFV, MASK_VFCVTRTZXUFV, match_vd_neq_vm, 0},
+{"vfcvt.rtz.x.f.v",  0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTRTZXFV, MASK_VFCVTRTZXFV, match_vd_neq_vm, 0},
+{"vfcvt.f.xu.v",     0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTFXUV, MASK_VFCVTFXUV, match_vd_neq_vm, 0},
+{"vfcvt.f.x.v",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFCVTFXV, MASK_VFCVTFXV, match_vd_neq_vm, 0},
+
+{"vfwcvt.xu.f.v",     0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTXUFV, MASK_VFWCVTXUFV, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.x.f.v",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTXFV, MASK_VFWCVTXFV, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.rtz.xu.f.v", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTRTZXUFV, MASK_VFWCVTRTZXUFV, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.rtz.x.f.v",  0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTRTZXFV, MASK_VFWCVTRTZXFV, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.f.xu.v",     0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFXUV, MASK_VFWCVTFXUV, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.f.x.v",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFXV, MASK_VFWCVTFXV, match_widen_vd_neq_vs2_neq_vm, 0},
+{"vfwcvt.f.f.v",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFWCVTFFV, MASK_VFWCVTFFV, match_widen_vd_neq_vs2_neq_vm, 0},
+
+{"vfncvt.xu.f.w",     0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTXUFW, MASK_VFNCVTXUFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.x.f.w",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTXFW, MASK_VFNCVTXFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.rtz.xu.f.w", 0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRTZXUFW, MASK_VFNCVTRTZXUFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.rtz.x.f.w",  0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRTZXFW, MASK_VFNCVTRTZXFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.f.xu.w",     0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFXUW, MASK_VFNCVTFXUW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.f.x.w",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFXW, MASK_VFNCVTFXW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.f.f.w",      0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTFFW, MASK_VFNCVTFFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+{"vfncvt.rod.f.f.w",  0, INSN_CLASS_V_AND_F, "Vd,VtVm", MATCH_VFNCVTRODFFW, MASK_VFNCVTRODFFW, match_narrow_vd_neq_vs2_neq_vm, 0},
+
+{"vredsum.vs", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDSUMVS, MASK_VREDSUMVS, match_opcode, 0},
+{"vredmaxu.vs",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDMAXUVS, MASK_VREDMAXUVS, match_opcode, 0},
+{"vredmax.vs", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDMAXVS, MASK_VREDMAXVS, match_opcode, 0},
+{"vredminu.vs",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDMINUVS, MASK_VREDMINUVS, match_opcode, 0},
+{"vredmin.vs", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDMINVS, MASK_VREDMINVS, match_opcode, 0},
+{"vredand.vs", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDANDVS, MASK_VREDANDVS, match_opcode, 0},
+{"vredor.vs",  0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDORVS, MASK_VREDORVS, match_opcode, 0},
+{"vredxor.vs", 0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VREDXORVS, MASK_VREDXORVS, match_opcode, 0},
+
+{"vwredsumu.vs",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWREDSUMUVS, MASK_VWREDSUMUVS, match_opcode, 0},
+{"vwredsum.vs",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VWREDSUMVS, MASK_VWREDSUMVS, match_opcode, 0},
+
+{"vfredosum.vs",0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDOSUMVS, MASK_VFREDOSUMVS, match_opcode, 0},
+{"vfredsum.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDSUMVS, MASK_VFREDSUMVS, match_opcode, 0},
+{"vfredmax.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDMAXVS, MASK_VFREDMAXVS, match_opcode, 0},
+{"vfredmin.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFREDMINVS, MASK_VFREDMINVS, match_opcode, 0},
+
+{"vfwredosum.vs",0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWREDOSUMVS, MASK_VFWREDOSUMVS, match_opcode, 0},
+{"vfwredsum.vs", 0, INSN_CLASS_V_AND_F, "Vd,Vt,VsVm", MATCH_VFWREDSUMVS, MASK_VFWREDSUMVS, match_opcode, 0},
+
+{"vmmv.m",     0, INSN_CLASS_V, "Vd,Vu", MATCH_VMANDMM, MASK_VMANDMM, match_vs1_eq_vs2, INSN_ALIAS},
+{"vmcpy.m",    0, INSN_CLASS_V, "Vd,Vu", MATCH_VMANDMM, MASK_VMANDMM, match_vs1_eq_vs2, INSN_ALIAS},
+{"vmclr.m",    0, INSN_CLASS_V, "Vv", MATCH_VMXORMM, MASK_VMXORMM, match_vd_eq_vs1_eq_vs2, INSN_ALIAS},
+{"vmset.m",    0, INSN_CLASS_V, "Vv", MATCH_VMXNORMM, MASK_VMXNORMM, match_vd_eq_vs1_eq_vs2, INSN_ALIAS},
+{"vmnot.m",    0, INSN_CLASS_V, "Vd,Vu", MATCH_VMNANDMM, MASK_VMNANDMM, match_vs1_eq_vs2, INSN_ALIAS},
+
+{"vmand.mm",   0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMANDMM, MASK_VMANDMM, match_opcode, 0},
+{"vmnand.mm",  0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMNANDMM, MASK_VMNANDMM, match_opcode, 0},
+{"vmandnot.mm",0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMANDNOTMM, MASK_VMANDNOTMM, match_opcode, 0},
+{"vmxor.mm",   0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMXORMM, MASK_VMXORMM, match_opcode, 0},
+{"vmor.mm",    0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMORMM, MASK_VMORMM, match_opcode, 0},
+{"vmnor.mm",   0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMNORMM, MASK_VMNORMM, match_opcode, 0},
+{"vmornot.mm", 0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMORNOTMM, MASK_VMORNOTMM, match_opcode, 0},
+{"vmxnor.mm",  0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VMXNORMM, MASK_VMXNORMM, match_opcode, 0},
+
+{"vpopc.m",    0, INSN_CLASS_V, "d,VtVm", MATCH_VPOPCM, MASK_VPOPCM, match_opcode, 0},
+{"vfirst.m",   0, INSN_CLASS_V, "d,VtVm", MATCH_VFIRSTM, MASK_VFIRSTM, match_opcode, 0},
+{"vmsbf.m",    0, INSN_CLASS_V, "Vd,VtVm", MATCH_VMSBFM, MASK_VMSBFM, match_vd_neq_vs2_neq_vm, 0},
+{"vmsif.m",    0, INSN_CLASS_V, "Vd,VtVm", MATCH_VMSIFM, MASK_VMSIFM, match_vd_neq_vs2_neq_vm, 0},
+{"vmsof.m",    0, INSN_CLASS_V, "Vd,VtVm", MATCH_VMSOFM, MASK_VMSOFM, match_vd_neq_vs2_neq_vm, 0},
+{"viota.m",    0, INSN_CLASS_V, "Vd,VtVm", MATCH_VIOTAM, MASK_VIOTAM, match_vd_neq_vs2_neq_vm, 0},
+{"vid.v",      0, INSN_CLASS_V, "VdVm", MATCH_VIDV, MASK_VIDV, match_vd_neq_vm, 0},
+
+{"vmv.x.s",    0, INSN_CLASS_V, "d,Vt", MATCH_VMVXS, MASK_VMVXS, match_opcode, 0},
+{"vmv.s.x",    0, INSN_CLASS_V, "Vd,s", MATCH_VMVSX, MASK_VMVSX, match_opcode, 0},
+
+{"vfmv.f.s",   0, INSN_CLASS_V_AND_F, "D,Vt", MATCH_VFMVFS, MASK_VFMVFS, match_opcode, 0},
+{"vfmv.s.f",   0, INSN_CLASS_V_AND_F, "Vd,S", MATCH_VFMVSF, MASK_VFMVSF, match_opcode, 0},
+
+{"vslideup.vx",0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VSLIDEUPVX, MASK_VSLIDEUPVX, match_vd_neq_vs2_neq_vm, 0},
+{"vslideup.vi",0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VSLIDEUPVI, MASK_VSLIDEUPVI, match_vd_neq_vs2_neq_vm, 0},
+{"vslidedown.vx",0,INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VSLIDEDOWNVX, MASK_VSLIDEDOWNVX, match_vd_neq_vm, 0},
+{"vslidedown.vi",0,INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VSLIDEDOWNVI, MASK_VSLIDEDOWNVI, match_vd_neq_vm, 0},
+
+{"vslide1up.vx",    0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VSLIDE1UPVX, MASK_VSLIDE1UPVX, match_vd_neq_vs2_neq_vm, 0},
+{"vslide1down.vx",  0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VSLIDE1DOWNVX, MASK_VSLIDE1DOWNVX, match_vd_neq_vm, 0},
+{"vfslide1up.vf",   0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSLIDE1UPVF, MASK_VFSLIDE1UPVF, match_vd_neq_vs2_neq_vm, 0},
+{"vfslide1down.vf", 0, INSN_CLASS_V_AND_F, "Vd,Vt,SVm", MATCH_VFSLIDE1DOWNVF, MASK_VFSLIDE1DOWNVF, match_vd_neq_vm, 0},
+
+{"vrgather.vv",    0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VRGATHERVV, MASK_VRGATHERVV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+{"vrgather.vx",    0, INSN_CLASS_V, "Vd,Vt,sVm", MATCH_VRGATHERVX, MASK_VRGATHERVX, match_vd_neq_vs2_neq_vm, 0},
+{"vrgather.vi",    0, INSN_CLASS_V, "Vd,Vt,VjVm", MATCH_VRGATHERVI, MASK_VRGATHERVI, match_vd_neq_vs2_neq_vm, 0},
+{"vrgatherei16.vv",0, INSN_CLASS_V, "Vd,Vt,VsVm", MATCH_VRGATHEREI16VV, MASK_VRGATHEREI16VV, match_vd_neq_vs1_neq_vs2_neq_vm, 0},
+
+{"vcompress.vm",0, INSN_CLASS_V, "Vd,Vt,Vs", MATCH_VCOMPRESSVM, MASK_VCOMPRESSVM, match_vd_neq_vs1_neq_vs2, 0},
+
+{"vmv1r.v",    0, INSN_CLASS_V, "Vd,Vt", MATCH_VMV1RV, MASK_VMV1RV, match_vmv_nf_rv, 0},
+{"vmv2r.v",    0, INSN_CLASS_V, "Vd,Vt", MATCH_VMV2RV, MASK_VMV2RV, match_vmv_nf_rv, 0},
+{"vmv4r.v",    0, INSN_CLASS_V, "Vd,Vt", MATCH_VMV4RV, MASK_VMV4RV, match_vmv_nf_rv, 0},
+{"vmv8r.v",    0, INSN_CLASS_V, "Vd,Vt", MATCH_VMV8RV, MASK_VMV8RV, match_vmv_nf_rv, 0},
+
+/* Terminate the list.  */
+{0, 0, INSN_CLASS_NONE, 0, 0, 0, 0, 0 },
+};
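Many of the entries above name `match_vd_neq_vm` as their match function: when a vector instruction is masked (`vm == 0`), it implicitly reads the mask from `v0`, so the destination register must not be `v0` itself. A minimal sketch of that constraint check, written in Python rather than the C of `opcodes/riscv-opc.c`, using the standard RVV field positions (`vd` in bits [11:7], `vm` in bit 25 of the OP-V encoding):

```python
# Sketch of a match_vd_neq_vm-style constraint check.  This mirrors the
# idea of the match functions referenced in the table above; it is not
# the binutils implementation.

def extract_vd(insn):
    # vd occupies bits [11:7] of the 32-bit encoding.
    return (insn >> 7) & 0x1f

def extract_vm(insn):
    # vm is bit 25; vm == 0 means the op is masked by v0.
    return (insn >> 25) & 0x1

def match_vd_neq_vm(insn):
    # A masked op implicitly reads v0, so vd must not overwrite v0.
    return extract_vm(insn) == 1 or extract_vd(insn) != 0
```

With this check, a masked instruction whose `vd` is `v0` is rejected, while unmasked instructions (`vm == 1`) may target any register.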
+
 /* The supported extended extensions.  */
 const struct riscv_opcode *riscv_extended_opcodes[] =
 {
+  riscv_draft_opcodes,
   NULL
 };