Iain Buclaw [Sun, 22 Mar 2020 12:11:10 +0000 (13:11 +0100)]
d: Generate phony targets for content imported files (PR93038)
This is in addition to the previous change, which started including them in
the make dependency list.
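For illustration (hypothetical file names), the effect is analogous to what GCC's -MP
option does for header dependencies: each content-imported file also gets an empty,
prerequisite-less rule, so make does not error out if that file is later deleted or
renamed. A generated dependency file might then look roughly like:

    app.o: app.d message.txt

    message.txt: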
gcc/d/ChangeLog:
2020-03-22 Iain Buclaw <ibuclaw@gdcproject.org>
PR d/93038
* d-lang.cc (deps_write): Generate phony targets for content imported
files.
gcc/testsuite/ChangeLog:
2020-03-22 Iain Buclaw <ibuclaw@gdcproject.org>
PR d/93038
* gdc.dg/pr93038b.d: New test.
Iain Sandoe [Sun, 22 Mar 2020 09:27:05 +0000 (09:27 +0000)]
Darwin: Fix i686 bootstrap when the assembler supports GOTOFF in data.
When we use an assembler that supports " .long XX@GOTOFF", the current
combination of configuration parameters and conditional compilation
(when building an i686-darwin compiler with -mdynamic-no-pic) assumes that
it's OK to put jump tables in the .const section.
However, when we encounter a weak function with a jump table, this
produces relocations that directly access the weak symbol section from
the .const section - which is deemed illegal by the linker (since that
would mean that the weak symbol could not be replaced).
Arguably, this is a limitation (maybe even a bug) in the linker - but
it seems that we'd have to change the ABI to fix it - since it would
require some annotation (maybe just using a special section for the
jump tables) to tell the linker that this specific circumstance is OK
because the direct access to the weak symbol can only occur from that
symbol itself.
The fix is to force jump tables into the text section for all X86 Darwin
versions (PIC code already had this change).
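As a rough illustration (not taken from the PR), the problematic shape is a weak
function whose dense switch gets lowered to a jump table; with the table placed in
.const, the table entries then refer back into the weak function's section:

    /* Hypothetical example: a dense switch in a weak function is
       typically lowered through a jump table.  */
    __attribute__((weak)) int classify (int c)
    {
      switch (c)
        {
        case 0: return 10;
        case 1: return 11;
        case 2: return 12;
        case 3: return 13;
        case 4: return 14;
        default: return -1;
        }
    }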
gcc/ChangeLog:
2020-03-22 Iain Sandoe <iain@sandoe.co.uk>
* config/i386/darwin.h (JUMP_TABLES_IN_TEXT_SECTION): Define this
unconditionally and comment on why.
* config/i386/i386.h (JUMP_TABLES_IN_TEXT_SECTION): Remove
references to Darwin.
Iain Sandoe [Fri, 20 Mar 2020 21:27:13 +0000 (21:27 +0000)]
testsuite: Fix lambda-vis.C for targets with user label prefix '_'.
This prepends an optional match for the additional USER_LABEL_PREFIX
to the scan-assembler checks.
2020-03-22 Iain Sandoe <iain@sandoe.co.uk>
* g++.dg/abi/lambda-vis.C: Amend assembler match
strings for targets using a USER_LABEL_PREFIX.
Iain Buclaw [Sat, 21 Mar 2020 23:10:17 +0000 (00:10 +0100)]
d: Fix missing dependencies in depfile for imported files (PR93038)
A new field for tracking imported files was added to the front-end; this patch
makes use of it by writing all such files to the make dependency list.
gcc/d/ChangeLog:
2020-03-22 Iain Buclaw <ibuclaw@gdcproject.org>
PR d/93038
* d-lang.cc (deps_write): Add content imported files to the make
dependency list.
gcc/testsuite/ChangeLog:
2020-03-22 Iain Buclaw <ibuclaw@gdcproject.org>
PR d/93038
* gdc.dg/fileimports/pr93038.txt: New test.
* gdc.dg/pr93038.d: New test.
Iain Buclaw [Sat, 21 Mar 2020 23:07:45 +0000 (00:07 +0100)]
d: Fix typo in ChangeLog for last change
Jonathan Wakely [Sat, 21 Mar 2020 22:11:44 +0000 (22:11 +0000)]
libstdc++: Fix experimental::path::generic_string (PR 93245)
This function was unimplemented, simply returning the native format
string instead.
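A minimal usage sketch of the intended behaviour (assuming a Windows-style native
format, where the preferred separator is '\'):

    #include <experimental/filesystem>
    #include <string>

    int main()
    {
      namespace fs = std::experimental::filesystem;
      fs::path p = fs::path("dir") / "file.txt"; // native form "dir\file.txt" on Windows
      std::string g = p.generic_string();        // expected "dir/file.txt" everywhere
    }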
PR libstdc++/93245
* include/experimental/bits/fs_path.h (path::generic_string<C,T,A>()):
Return the path in the generic format rather than the native format.
* testsuite/experimental/filesystem/path/generic/generic_string.cc:
Improve test coverage.
Jonathan Wakely [Sat, 21 Mar 2020 21:51:07 +0000 (21:51 +0000)]
libstdc++: Fix path::generic_string allocator handling (PR 94242)
It's not possible to construct a path::string_type from an allocator of
a different type. Create the correct specialization of basic_string, and
adjust path::_S_str_convert to use a basic_string_view so that it is
independent of the allocator type.
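A sketch of the kind of call this enables, with a hypothetical stateless allocator
MyAlloc standing in for any allocator type other than std::allocator<char>:

    #include <filesystem>
    #include <memory>
    #include <string>

    template<typename T>
    struct MyAlloc : std::allocator<T>   // hypothetical custom allocator
    {
      template<typename U> struct rebind { using other = MyAlloc<U>; };
    };

    int main()
    {
      std::filesystem::path p = "dir/file.txt";
      // Returns a basic_string that uses MyAlloc<char>; path::string_type cannot
      // be constructed from this allocator, hence the string_view-based conversion.
      auto s = p.generic_string<char, std::char_traits<char>, MyAlloc<char>>();
    }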
PR libstdc++/94242
* include/bits/fs_path.h (path::_S_str_convert): Replace first
parameter with basic_string_view so that strings with different
allocators can be accepted.
(path::generic_string<C,T,A>()): Use basic_string object that uses the
right allocator type.
* testsuite/27_io/filesystem/path/generic/94242.cc: New test.
* testsuite/27_io/filesystem/path/generic/generic_string.cc: Improve
test coverage.
Iain Sandoe [Sat, 21 Mar 2020 13:20:47 +0000 (13:20 +0000)]
Darwin: Handle NULL DECL_SIZE_TYPE in machopic_select_section (PR94237).
A recent change in the LTO streaming arrangement means that it is
now possible for machopic_select_section () to be called with a NULL
value for DECL_SIZE_TYPE, corresponding to an incomplete or not-yet-laid-out
type.
When section anchors are present, and we are generating assembler, we
normally need to know the object size when choosing the section, since
zero-sized objects must be placed in sections that forbid section
anchors.
In the current circumstance, the objective of the earlier streaming of
this data is to allow nm to distinguish BSS from data symbols (when used
with the LTO plugin). Since Darwin does not yet make use of the plugin,
this fix is a bit of future-proofing. We now emit the 'generic' section
for the decl (absent knowing its size), which is still correct for
distinguishing the BSS and data cases.
gcc/ChangeLog:
2020-03-21 Iain Sandoe <iain@sandoe.co.uk>
PR lto/94237
* config/darwin.c (darwin_mergeable_constant_section): Collect
section anchor checks into the caller.
(machopic_select_section): Collect section anchor checks into
the determination of 'effective zero-size' objects. When the
size is unknown, assume it is non-zero, and thus return the
'generic' section for the DECL.
Iain Sandoe [Sun, 1 Mar 2020 13:04:58 +0000 (13:04 +0000)]
Darwin: Address translation comments (PR93694).
This updates the option descriptions after feedback from
a translator.
gcc/ChangeLog:
2020-03-21 Iain Sandoe <iain@sandoe.co.uk>
PR target/93694
* config/darwin.opt: Amend option descriptions.
Iain Buclaw [Sat, 21 Mar 2020 10:38:59 +0000 (11:38 +0100)]
d: Fix ICE in add_symbol_to_partition_1, at lto/lto-partition.c:215
This patch addresses two problems with TypeInfo initializer generation.
1. D array fields pointing to compiler-generated data reference public
symbols with no unique prefix, which can lead to duplicate definition
errors in some hard-to-reduce cases. To avoid name clashes, all symbols
generated for TypeInfo initializers now use the assembler name of the
TypeInfo decl as a prefix.
2. An ICE would occur during the LTO pass because these same decls were
considered to be part of the same comdat group as the TypeInfo decl that
refers to them, despite being neither marked public nor comdat themselves.
This resulted in decls being added to the LTRANS partition out of order,
triggering an assert when add_symbol_to_partition_1 attempted to add
them again. To remedy this, TREE_PUBLIC and DECL_COMDAT are now set on all
generated symbols.
gcc/d/ChangeLog:
2020-03-21 Iain Buclaw <ibuclaw@gdcproject.org>
PR d/94290
* typeinfo.cc (class TypeInfoVisitor): Replace type_ field with decl_.
(TypeInfoVisitor::TypeInfoVisitor): Set decl_.
(TypeInfoVisitor::result): Update.
(TypeInfoVisitor::internal_reference): New function.
(TypeInfoVisitor::layout_string): Use internal_reference.
(TypeInfoVisitor::visit (TypeInfoTupleDeclaration *)): Likewise.
(layout_typeinfo): Construct TypeInfoVisitor with typeinfo decl.
(layout_classinfo): Likewise.
Patrick Palka [Thu, 19 Mar 2020 13:58:28 +0000 (09:58 -0400)]
c++: Reject changing active member of union during initialization [PR94066]
This patch adds a check to detect changing the active union member during
initialization of another member of the union in cxx_eval_store_expression. It
uses the CONSTRUCTOR_NO_CLEARING flag as a proxy for whether the non-empty
CONSTRUCTOR of UNION_TYPE we're assigning to is in the process of being
initialized.
This patch additionally fixes an issue in reduced_constant_expression_p where
we were returning false for an uninitialized union with no active member. This
lets us correctly reject the uninitialized use in the testcase
constexpr-union4.C that we weren't rejecting before.
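A hedged sketch (not the PR's exact testcase) of the kind of code this now rejects
in a constant expression - the call stores to u.b while u's first member is still
being initialized:

    union U { int a; int b; };

    constexpr int set_b (U *u)
    {
      u->b = 1;   // changes the active member of *u ...
      return 2;
    }

    // ... while u is still in the middle of initializing member 'a':
    constexpr U u = { set_b (const_cast<U *> (&u)) };   // now diagnosed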
gcc/cp/ChangeLog:
PR c++/94066
* constexpr.c (reduced_constant_expression_p) [CONSTRUCTOR]: Properly
handle unions without an initializer.
(cxx_eval_component_reference): Emit a different diagnostic when the
constructor element corresponding to a union member is NULL.
(cxx_eval_bare_aggregate): When constructing a union, always set the
active union member before evaluating the initializer. Relax assertion
that verifies the index of the constructor element we're initializing
hasn't been changed.
(cxx_eval_store_expression): Diagnose changing the active union member
while the union is in the process of being initialized. After setting
an active union member, clear CONSTRUCTOR_NO_CLEARING on the underlying
CONSTRUCTOR.
(cxx_eval_constant_expression) [PLACEHOLDER_EXPR]: Don't re-reduce a
CONSTRUCTOR returned by lookup_placeholder.
gcc/testsuite/ChangeLog:
PR c++/94066
* g++.dg/cpp1y/constexpr-union2.C: New test.
* g++.dg/cpp1y/constexpr-union3.C: New test.
* g++.dg/cpp1y/constexpr-union4.C: New test.
* g++.dg/cpp1y/constexpr-union5.C: New test.
* g++.dg/cpp1y/pr94066.C: New test.
* g++.dg/cpp1y/pr94066-2.C: New test.
* g++.dg/cpp1y/pr94066-3.C: New test.
* g++.dg/cpp2a/constexpr-union1.C: New test.
Richard Sandiford [Mon, 9 Mar 2020 19:42:57 +0000 (19:42 +0000)]
lra: Tighten check for reloading paradoxical subregs [PR94052]
simplify_operand_subreg tries to detect whether the allocation for
a pseudo in a paradoxical subreg is also valid for the outer mode.
The condition it used to check for an invalid combination was:
  else if (REG_P (reg)
           && REGNO (reg) >= FIRST_PSEUDO_REGISTER
           && (hard_regno = lra_get_regno_hard_regno (REGNO (reg))) >= 0
           && (hard_regno_nregs (hard_regno, innermode)
               < hard_regno_nregs (hard_regno, mode))
           && (regclass = lra_get_allocno_class (REGNO (reg)))
           && (type != OP_IN
               || !in_hard_reg_set_p (reg_class_contents[regclass],
                                      mode, hard_regno)
               || overlaps_hard_reg_set_p (lra_no_alloc_regs,
                                           mode, hard_regno)))
I think there are two problems with this:
(1) It never actually checks whether the hard register is valid for the
outer mode (in the hard_regno_mode_ok sense). If it isn't, any attempt
to reload in the outer mode is likely to cycle, because the implied
regno/mode combination will be just as invalid next time
curr_insn_transform sees the subreg.
(2) The check is valid for little-endian only. For big-endian we need
to move hard_regno backwards.
Using simplify_subreg_regno should avoid both problems.
As the existing comment says, IRA should always take subreg references
into account when allocating hard registers, so this fix-up should only
really be needed for pseudos allocated by LRA itself.
gcc/
2020-03-21 Richard Sandiford <richard.sandiford@arm.com>
PR rtl-optimization/94052
* lra-constraints.c (simplify_operand_subreg): Reload the inner
register of a paradoxical subreg if simplify_subreg_regno fails
to give a valid hard register for the outer mode.
gcc/testsuite/
2020-03-21 Tamar Christina <tamar.christina@arm.com>
PR target/94052
* gcc.target/aarch64/pr94052.C: New test.
Martin Liska [Sat, 21 Mar 2020 07:09:02 +0000 (08:09 +0100)]
Fix comma at end of enumerator list seen with -std=c++98.
* plugin-api.h (enum ld_plugin_symbol_type): Remove
comma after last value of an enum.
* lto-symtab.h (enum gcc_plugin_symbol_type): Likewise.
GCC Administrator [Sat, 21 Mar 2020 00:16:22 +0000 (00:16 +0000)]
Daily bump.
Martin Jambor [Fri, 20 Mar 2020 23:21:02 +0000 (00:21 +0100)]
sra: Cap number of sub-access propagations with a param (PR 93435)
PR 93435 is a perfect SRA bomb. It initializes an array of 16 chars
element-wise, then uses that to initialize an aggregate that consists
of four such arrays, that one to initialize one four times as big as
the previous one all the way to an aggregate that has 64kb.
This causes the sub-access propagation across assignments to create
thousands of byte-sized artificial accesses which are then eligible to
be replaced - they do facilitate forward propagation but there is
enough of them for DSE to never finish.
This patch avoids that situation by accounting how many of such
replacements can be created per SRA candidate. The default value of
32 was just the largest power of two that did not slow down
compilation of the testcase, but it should also hopefully be big
enough for any reasonable input that might rely on the optimization.
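The cap is exposed as a regular --param, so a workload that genuinely benefits from
deeper propagation can presumably raise it on the command line, for example:

    gcc -O2 --param sra-max-propagations=64 test.c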
2020-03-20 Martin Jambor <mjambor@suse.cz>
PR tree-optimization/93435
* params.opt (sra-max-propagations): New parameter.
* tree-sra.c (propagation_budget): New variable.
(budget_for_propagation_access): New function.
(propagate_subaccesses_from_rhs): Use it.
(propagate_subaccesses_from_lhs): Likewise.
(propagate_all_subaccesses): Set up and destroy propagation_budget.
gcc/testsuite/
* gcc.dg/tree-ssa/pr93435.c: New test.
Joseph Myers [Fri, 20 Mar 2020 23:19:02 +0000 (23:19 +0000)]
Regenerate gcc.pot.
* gcc.pot: Regenerate.
Carl Love [Fri, 20 Mar 2020 23:15:05 +0000 (18:15 -0500)]
rs6000: Add command line and builtin compatibility check
2020-03-20 Carl Love <cel@us.ibm.com>
PR target/87583
* config/rs6000/rs6000.c (rs6000_option_override_internal):
Add check for TARGET_FPRND for Power 7 or newer.
Jan Hubicka [Fri, 20 Mar 2020 21:06:24 +0000 (22:06 +0100)]
Fix verifier ICE on wrong comdat local flag [PR93347]
gcc/ChangeLog:
2020-03-20 Jan Hubicka <hubicka@ucw.cz>
PR ipa/93347
* cgraph.c (symbol_table::create_edge): Update calls_comdat_local flag.
(cgraph_edge::redirect_callee): Move here; likewise.
(cgraph_node::remove_callees): Update calls_comdat_local flag.
(cgraph_node::verify_node): Verify that calls_comdat_local flag match
reality.
(cgraph_node::check_calls_comdat_local_p): New member function.
* cgraph.h (cgraph_node::check_calls_comdat_local_p): Declare.
(cgraph_edge::redirect_callee): Move offline.
* ipa-fnsummary.c (compute_fn_summary): Do not compute
calls_comdat_local flag here.
* ipa-inline-transform.c (inline_call): Fix updating of
calls_comdat_local flag.
* ipa-split.c (split_function): Use true instead of 1 to set the flag.
* symtab.c (symtab_node::add_to_same_comdat_group): Update
calls_comdat_local flag.
gcc/testsuite/ChangeLog:
2020-03-20 Jan Hubicka <hubicka@ucw.cz>
* g++.dg/torture/pr93347.C: New test.
Richard Biener [Fri, 20 Mar 2020 17:57:30 +0000 (18:57 +0100)]
adjust SLP tree dumping
This also dumps the root node we eventually smuggle in.
2020-03-20 Richard Biener <rguenther@suse.de>
* tree-vect-slp.c (vect_analyze_slp_instance): Dump SLP tree
from the possibly modified root.
Patrick Palka [Fri, 20 Mar 2020 17:06:21 +0000 (13:06 -0400)]
c++: Add testcases from PR c++/69694
These testcases are compiling successfully since 7.1.
gcc/testsuite/ChangeLog:
PR c++/69694
* g++.dg/cpp0x/decltype74.C: New test.
* g++.dg/cpp0x/decltype75.C: New test.
Srinath Parvathaneni [Fri, 20 Mar 2020 16:56:23 +0000 (16:56 +0000)]
[ARM][GCC][11x]: MVE ACLE vector interleaving store and deinterleaving load intrinsics and also aliases to vstr and vldr intrinsics.
This patch supports the following MVE ACLE intrinsics, which are aliases of the
vstr and vldr intrinsics.
vst1q_p_u8, vst1q_p_s8, vld1q_z_u8, vld1q_z_s8, vst1q_p_u16, vst1q_p_s16,
vld1q_z_u16, vld1q_z_s16, vst1q_p_u32, vst1q_p_s32, vld1q_z_u32, vld1q_z_s32,
vld1q_z_f16, vst1q_p_f16, vld1q_z_f32, vst1q_p_f32.
This patch also supports the following MVE ACLE vector deinterleaving loads and
vector interleaving stores.
vst2q_s8, vst2q_u8, vld2q_s8, vld2q_u8, vld4q_s8, vld4q_u8, vst2q_s16, vst2q_u16,
vld2q_s16, vld2q_u16, vld4q_s16, vld4q_u16, vst2q_s32, vst2q_u32, vld2q_s32,
vld2q_u32, vld4q_s32, vld4q_u32, vld4q_f16, vld2q_f16, vst2q_f16, vld4q_f32,
vld2q_f32, vst2q_f32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
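A small usage sketch (signatures as documented in the MVE ACLE; requires an
MVE-enabled target):

    #include <arm_mve.h>

    /* De-interleave eight int32 values into two vectors, scale each stream,
       then interleave the results back into memory.  */
    void scale_pairs (int32_t *buf)
    {
      int32x4x2_t v = vld2q_s32 (buf);       /* val[0] = even, val[1] = odd elements */
      v.val[0] = vmulq_n_s32 (v.val[0], 2);
      v.val[1] = vmulq_n_s32 (v.val[1], 3);
      vst2q_s32 (buf, v);
    }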
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/arm_mve.h (vst1q_p_u8): Define macro.
(vst1q_p_s8): Likewise.
(vst2q_s8): Likewise.
(vst2q_u8): Likewise.
(vld1q_z_u8): Likewise.
(vld1q_z_s8): Likewise.
(vld2q_s8): Likewise.
(vld2q_u8): Likewise.
(vld4q_s8): Likewise.
(vld4q_u8): Likewise.
(vst1q_p_u16): Likewise.
(vst1q_p_s16): Likewise.
(vst2q_s16): Likewise.
(vst2q_u16): Likewise.
(vld1q_z_u16): Likewise.
(vld1q_z_s16): Likewise.
(vld2q_s16): Likewise.
(vld2q_u16): Likewise.
(vld4q_s16): Likewise.
(vld4q_u16): Likewise.
(vst1q_p_u32): Likewise.
(vst1q_p_s32): Likewise.
(vst2q_s32): Likewise.
(vst2q_u32): Likewise.
(vld1q_z_u32): Likewise.
(vld1q_z_s32): Likewise.
(vld2q_s32): Likewise.
(vld2q_u32): Likewise.
(vld4q_s32): Likewise.
(vld4q_u32): Likewise.
(vld4q_f16): Likewise.
(vld2q_f16): Likewise.
(vld1q_z_f16): Likewise.
(vst2q_f16): Likewise.
(vst1q_p_f16): Likewise.
(vld4q_f32): Likewise.
(vld2q_f32): Likewise.
(vld1q_z_f32): Likewise.
(vst2q_f32): Likewise.
(vst1q_p_f32): Likewise.
(__arm_vst1q_p_u8): Define intrinsic.
(__arm_vst1q_p_s8): Likewise.
(__arm_vst2q_s8): Likewise.
(__arm_vst2q_u8): Likewise.
(__arm_vld1q_z_u8): Likewise.
(__arm_vld1q_z_s8): Likewise.
(__arm_vld2q_s8): Likewise.
(__arm_vld2q_u8): Likewise.
(__arm_vld4q_s8): Likewise.
(__arm_vld4q_u8): Likewise.
(__arm_vst1q_p_u16): Likewise.
(__arm_vst1q_p_s16): Likewise.
(__arm_vst2q_s16): Likewise.
(__arm_vst2q_u16): Likewise.
(__arm_vld1q_z_u16): Likewise.
(__arm_vld1q_z_s16): Likewise.
(__arm_vld2q_s16): Likewise.
(__arm_vld2q_u16): Likewise.
(__arm_vld4q_s16): Likewise.
(__arm_vld4q_u16): Likewise.
(__arm_vst1q_p_u32): Likewise.
(__arm_vst1q_p_s32): Likewise.
(__arm_vst2q_s32): Likewise.
(__arm_vst2q_u32): Likewise.
(__arm_vld1q_z_u32): Likewise.
(__arm_vld1q_z_s32): Likewise.
(__arm_vld2q_s32): Likewise.
(__arm_vld2q_u32): Likewise.
(__arm_vld4q_s32): Likewise.
(__arm_vld4q_u32): Likewise.
(__arm_vld4q_f16): Likewise.
(__arm_vld2q_f16): Likewise.
(__arm_vld1q_z_f16): Likewise.
(__arm_vst2q_f16): Likewise.
(__arm_vst1q_p_f16): Likewise.
(__arm_vld4q_f32): Likewise.
(__arm_vld2q_f32): Likewise.
(__arm_vld1q_z_f32): Likewise.
(__arm_vst2q_f32): Likewise.
(__arm_vst1q_p_f32): Likewise.
(vld1q_z): Define polymorphic variant.
(vld2q): Likewise.
(vld4q): Likewise.
(vst1q_p): Likewise.
(vst2q): Likewise.
* config/arm/arm_mve_builtins.def (STORE1): Use builtin qualifier.
(LOAD1): Likewise.
* config/arm/mve.md (mve_vst2q<mode>): Define RTL pattern.
(mve_vld2q<mode>): Likewise.
(mve_vld4q<mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* gcc.target/arm/mve/intrinsics/vld1q_z_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vld1q_z_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_z_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_z_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld2q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld4q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst2q_u8.c: Likewise.
Iain Buclaw [Fri, 20 Mar 2020 16:26:29 +0000 (17:26 +0100)]
d: Fix SEGV in hash_table<odr_name_hasher, false, xcallocator>::find_slot_with_hash
This patch fixes an LTO bug with the D front-end. Because DECL_ASSEMBLER_NAME
is set on the TYPE_DECL, TYPE_CXX_ODR_P must also be set on the type.
The addition of merge_aggregate_types is not strictly needed now, but it
fixes a problem introduced in newer versions of the dmd front-end where
templated types could be sent more than once to the D code generator.
gcc/d/ChangeLog:
2020-03-20 Iain Buclaw <ibuclaw@gdcproject.org>
PR lto/91027
* d-tree.h (struct GTY): Add daggregate field.
(IDENTIFIER_DAGGREGATE): Define.
(d_mangle_decl): Add declaration.
* decl.cc (mangle_decl): Remove static linkage, rename to...
(d_mangle_decl): ...this, update all callers.
* types.cc (merge_aggregate_types): New function.
(TypeVisitor::visit (TypeStruct *)): Call merge_aggregate_types, set
IDENTIFIER_DAGGREGATE and TYPE_CXX_ODR_P.
(TypeVisitor::visit (TypeClass *)): Likewise.
Richard Sandiford [Tue, 17 Mar 2020 14:43:08 +0000 (14:43 +0000)]
c-family: Tighten vector handling in type_for_mode [PR94072]
In this PR we had a 512-bit VECTOR_TYPE whose mode is XImode
(an integer mode used for four 128-bit vectors). When trying
to expand a zero constant for it, we hit code in expand_expr_real_1
that tries to use the associated integer type instead. The code used
type_for_mode (XImode, 1) to get this integer type.
However, the c-family implementation of type_for_mode checks for
any registered built-in type that matches the mode and has the
right signedness. This meant that it could return a built-in
vector type when given an integer mode (particularly if, as here,
the vector type isn't supported by the current subtarget and so
TYPE_MODE != TYPE_MODE_RAW). The expand code would then cycle
endlessly trying to use this "new" type instead of the original
vector type.
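A rough paraphrase of the added check (not the actual GCC source): while scanning
the registered built-in types for one whose mode matches, a candidate is now also
required to be a vector type exactly when the requested mode is a vector mode,
i.e. roughly

    if (VECTOR_TYPE_P (type) != VECTOR_MODE_P (mode))
      continue;  /* e.g. never hand back a built-in vector type for XImode */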
2020-03-20 Richard Sandiford <richard.sandiford@arm.com>
gcc/c-family/
PR middle-end/94072
* c-common.c (c_common_type_for_mode): Before using a registered
built-in type, check that the vectorness of the type matches
the vectorness of the mode.
gcc/testsuite/
PR middle-end/94072
* gcc.target/aarch64/pr94072.c: New test.
Srinath Parvathaneni [Fri, 20 Mar 2020 16:06:58 +0000 (16:06 +0000)]
[ARM][GCC][10x]: MVE ACLE intrinsics "add with carry across beats" and "beat-wise subtract".
This patch supports the following MVE ACLE "add with carry across beats" intrinsics and "beat-wise subtract" intrinsics.
vadciq_s32, vadciq_u32, vadciq_m_s32, vadciq_m_u32, vadcq_s32, vadcq_u32, vadcq_m_s32, vadcq_m_u32, vsbciq_s32, vsbciq_u32, vsbciq_m_s32, vsbciq_m_u32, vsbcq_s32, vsbcq_u32, vsbcq_m_s32, vsbcq_m_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
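A usage sketch (signatures as documented in the MVE ACLE; the carry is passed in and
out through the pointer argument), assuming a 256-bit value stored as eight 32-bit
limbs in little-endian limb order:

    #include <arm_mve.h>

    void add256 (int32_t *r, const int32_t *a, const int32_t *b)
    {
      unsigned carry;
      /* vadciq: carry-in is zero, carry-out written to 'carry'.  */
      int32x4_t lo = vadciq_s32 (vld1q_s32 (a), vld1q_s32 (b), &carry);
      /* vadcq: carry-in read from 'carry', carry-out written back.  */
      int32x4_t hi = vadcq_s32 (vld1q_s32 (a + 4), vld1q_s32 (b + 4), &carry);
      vst1q_s32 (r, lo);
      vst1q_s32 (r + 4, hi);
    }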
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/arm-builtins.c (ARM_BUILTIN_GET_FPSCR_NZCVQC): Define.
(ARM_BUILTIN_SET_FPSCR_NZCVQC): Likewise.
(arm_init_mve_builtins): Add "__builtin_arm_get_fpscr_nzcvqc" and
"__builtin_arm_set_fpscr_nzcvqc" to arm_builtin_decls array.
(arm_expand_builtin): Define case ARM_BUILTIN_GET_FPSCR_NZCVQC
and ARM_BUILTIN_SET_FPSCR_NZCVQC.
* config/arm/arm_mve.h (vadciq_s32): Define macro.
(vadciq_u32): Likewise.
(vadciq_m_s32): Likewise.
(vadciq_m_u32): Likewise.
(vadcq_s32): Likewise.
(vadcq_u32): Likewise.
(vadcq_m_s32): Likewise.
(vadcq_m_u32): Likewise.
(vsbciq_s32): Likewise.
(vsbciq_u32): Likewise.
(vsbciq_m_s32): Likewise.
(vsbciq_m_u32): Likewise.
(vsbcq_s32): Likewise.
(vsbcq_u32): Likewise.
(vsbcq_m_s32): Likewise.
(vsbcq_m_u32): Likewise.
(__arm_vadciq_s32): Define intrinsic.
(__arm_vadciq_u32): Likewise.
(__arm_vadciq_m_s32): Likewise.
(__arm_vadciq_m_u32): Likewise.
(__arm_vadcq_s32): Likewise.
(__arm_vadcq_u32): Likewise.
(__arm_vadcq_m_s32): Likewise.
(__arm_vadcq_m_u32): Likewise.
(__arm_vsbciq_s32): Likewise.
(__arm_vsbciq_u32): Likewise.
(__arm_vsbciq_m_s32): Likewise.
(__arm_vsbciq_m_u32): Likewise.
(__arm_vsbcq_s32): Likewise.
(__arm_vsbcq_u32): Likewise.
(__arm_vsbcq_m_s32): Likewise.
(__arm_vsbcq_m_u32): Likewise.
(vadciq_m): Define polymorphic variant.
(vadciq): Likewise.
(vadcq_m): Likewise.
(vadcq): Likewise.
(vsbciq_m): Likewise.
(vsbciq): Likewise.
(vsbcq_m): Likewise.
(vsbcq): Likewise.
* config/arm/arm_mve_builtins.def (BINOP_NONE_NONE_NONE): Use builtin
qualifier.
(BINOP_UNONE_UNONE_UNONE): Likewise.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (VADCIQ): Define iterator.
(VADCIQ_M): Likewise.
(VSBCQ): Likewise.
(VSBCQ_M): Likewise.
(VSBCIQ): Likewise.
(VSBCIQ_M): Likewise.
(VADCQ): Likewise.
(VADCQ_M): Likewise.
(mve_vadciq_m_<supf>v4si): Define RTL pattern.
(mve_vadciq_<supf>v4si): Likewise.
(mve_vadcq_m_<supf>v4si): Likewise.
(mve_vadcq_<supf>v4si): Likewise.
(mve_vsbciq_m_<supf>v4si): Likewise.
(mve_vsbciq_<supf>v4si): Likewise.
(mve_vsbcq_m_<supf>v4si): Likewise.
(mve_vsbcq_<supf>v4si): Likewise.
(get_fpscr_nzcvqc): Define insn.
(set_fpscr_nzcvqc): Define insn.
* config/arm/unspecs.md (UNSPEC_GET_FPSCR_NZCVQC): Define.
(UNSPEC_SET_FPSCR_NZCVQC): Define.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* gcc.target/arm/mve/intrinsics/vadciq_m_s32.c: New test.
* gcc.target/arm/mve/intrinsics/vadciq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vadciq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vadciq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vadcq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vadcq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vadcq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vadcq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbciq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbciq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbciq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbciq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbcq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbcq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbcq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsbcq_u32.c: Likewise.
Patrick Palka [Wed, 18 Mar 2020 17:57:24 +0000 (13:57 -0400)]
c++: Include the constraint parameter mapping in diagnostic constraint contexts
When diagnosing a constraint error, we currently try to print the constraint
inside a diagnostic constraint context with its template arguments substituted
in. If substitution fails, then we instead just print the dependent form, as in
the test case below:
.../diagnostic6.C:14:15: error: static assertion failed
14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
| ^~~~~~
.../diagnostic6.C:14:15: note: constraints not satisfied
.../diagnostic6.C:4:11: required for the satisfaction of ‘C<T>’
.../diagnostic6.C:8:11: required for the satisfaction of ‘D<typename T::type>’
.../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type
But printing just the dependent form sometimes makes it difficult to understand
the underlying failure. In the above example, for instance, there's no
indication of how the template argument 'int' relates to either of the 'T's.
This patch improves the situation by changing these diagnostics to always print
the dependent form of the constraint, and alongside it the (preferably
substituted) constraint parameter mapping. So for the same test case we
now get:
.../diagnostic6.C:14:15: error: static assertion failed
14 | static_assert(E<int>); // { dg-error "static assertion failed|not a class" }
| ^~~~~~
.../diagnostic6.C:14:15: note: constraints not satisfied
.../diagnostic6.C:4:11: required for the satisfaction of ‘C<T>’ [with T = typename T::type]
.../diagnostic6.C:8:11: required for the satisfaction of ‘D<typename T::type>’ [with T = int]
.../diagnostic6.C:14:15: error: ‘int’ is not a class, struct, or union type
This change arguably makes it easier to figure out what's going on whenever a
constraint fails due to substitution creating an invalid type rather than
failing due to the constraint evaluating to false.
gcc/cp/ChangeLog:
* cxx-pretty-print.c (pp_cxx_parameter_mapping): Make extern. Move
the "[with ]" bits to here from ...
(pp_cxx_atomic_constraint): ... here.
* cxx-pretty-print.h (pp_cxx_parameter_mapping): Declare.
* error.c (rebuild_concept_check): Delete.
(print_concept_check_info): Print the dependent form of the constraint and the
preferably substituted parameter mapping alongside it.
gcc/testsuite/ChangeLog:
* g++.dg/concepts/diagnostic6.C: New test.
Srinath Parvathaneni [Fri, 20 Mar 2020 14:14:35 +0000 (14:14 +0000)]
[ARM][GCC][9x]: MVE ACLE predicated intrinsics with (don't-care) variant.
This patch supports the following MVE ACLE predicated intrinsics with the `_x` (don't-care) variant.
* ``_x`` (don't-care) indicates that the false-predicated lanes have undefined values.
These are syntactic sugar for merge intrinsics with a ``vuninitializedq`` inactive parameter (see the sketch after the intrinsic list below).
vabdq_x_f16, vabdq_x_f32, vabdq_x_s16, vabdq_x_s32, vabdq_x_s8, vabdq_x_u16, vabdq_x_u32, vabdq_x_u8,
vabsq_x_f16, vabsq_x_f32, vabsq_x_s16, vabsq_x_s32, vabsq_x_s8, vaddq_x_f16, vaddq_x_f32, vaddq_x_n_f16,
vaddq_x_n_f32, vaddq_x_n_s16, vaddq_x_n_s32, vaddq_x_n_s8, vaddq_x_n_u16, vaddq_x_n_u32, vaddq_x_n_u8,
vaddq_x_s16, vaddq_x_s32, vaddq_x_s8, vaddq_x_u16, vaddq_x_u32, vaddq_x_u8, vandq_x_f16, vandq_x_f32,
vandq_x_s16, vandq_x_s32, vandq_x_s8, vandq_x_u16, vandq_x_u32, vandq_x_u8, vbicq_x_f16, vbicq_x_f32,
vbicq_x_s16, vbicq_x_s32, vbicq_x_s8, vbicq_x_u16, vbicq_x_u32, vbicq_x_u8, vbrsrq_x_n_f16,
vbrsrq_x_n_f32, vbrsrq_x_n_s16, vbrsrq_x_n_s32, vbrsrq_x_n_s8, vbrsrq_x_n_u16, vbrsrq_x_n_u32,
vbrsrq_x_n_u8, vcaddq_rot270_x_f16, vcaddq_rot270_x_f32, vcaddq_rot270_x_s16, vcaddq_rot270_x_s32,
vcaddq_rot270_x_s8, vcaddq_rot270_x_u16, vcaddq_rot270_x_u32, vcaddq_rot270_x_u8, vcaddq_rot90_x_f16,
vcaddq_rot90_x_f32, vcaddq_rot90_x_s16, vcaddq_rot90_x_s32, vcaddq_rot90_x_s8, vcaddq_rot90_x_u16,
vcaddq_rot90_x_u32, vcaddq_rot90_x_u8, vclsq_x_s16, vclsq_x_s32, vclsq_x_s8, vclzq_x_s16, vclzq_x_s32,
vclzq_x_s8, vclzq_x_u16, vclzq_x_u32, vclzq_x_u8, vcmulq_rot180_x_f16, vcmulq_rot180_x_f32,
vcmulq_rot270_x_f16, vcmulq_rot270_x_f32, vcmulq_rot90_x_f16, vcmulq_rot90_x_f32, vcmulq_x_f16,
vcmulq_x_f32, vcvtaq_x_s16_f16, vcvtaq_x_s32_f32, vcvtaq_x_u16_f16, vcvtaq_x_u32_f32, vcvtbq_x_f32_f16,
vcvtmq_x_s16_f16, vcvtmq_x_s32_f32, vcvtmq_x_u16_f16, vcvtmq_x_u32_f32, vcvtnq_x_s16_f16,
vcvtnq_x_s32_f32, vcvtnq_x_u16_f16, vcvtnq_x_u32_f32, vcvtpq_x_s16_f16, vcvtpq_x_s32_f32,
vcvtpq_x_u16_f16, vcvtpq_x_u32_f32, vcvtq_x_f16_s16, vcvtq_x_f16_u16, vcvtq_x_f32_s32, vcvtq_x_f32_u32,
vcvtq_x_n_f16_s16, vcvtq_x_n_f16_u16, vcvtq_x_n_f32_s32, vcvtq_x_n_f32_u32, vcvtq_x_n_s16_f16,
vcvtq_x_n_s32_f32, vcvtq_x_n_u16_f16, vcvtq_x_n_u32_f32, vcvtq_x_s16_f16, vcvtq_x_s32_f32,
vcvtq_x_u16_f16, vcvtq_x_u32_f32, vcvttq_x_f32_f16, vddupq_x_n_u16, vddupq_x_n_u32, vddupq_x_n_u8,
vddupq_x_wb_u16, vddupq_x_wb_u32, vddupq_x_wb_u8, vdupq_x_n_f16, vdupq_x_n_f32, vdupq_x_n_s16,
vdupq_x_n_s32, vdupq_x_n_s8, vdupq_x_n_u16, vdupq_x_n_u32, vdupq_x_n_u8, vdwdupq_x_n_u16, vdwdupq_x_n_u32,
vdwdupq_x_n_u8, vdwdupq_x_wb_u16, vdwdupq_x_wb_u32, vdwdupq_x_wb_u8, veorq_x_f16, veorq_x_f32, veorq_x_s16,
veorq_x_s32, veorq_x_s8, veorq_x_u16, veorq_x_u32, veorq_x_u8, vhaddq_x_n_s16, vhaddq_x_n_s32,
vhaddq_x_n_s8, vhaddq_x_n_u16, vhaddq_x_n_u32, vhaddq_x_n_u8, vhaddq_x_s16, vhaddq_x_s32, vhaddq_x_s8,
vhaddq_x_u16, vhaddq_x_u32, vhaddq_x_u8, vhcaddq_rot270_x_s16, vhcaddq_rot270_x_s32, vhcaddq_rot270_x_s8,
vhcaddq_rot90_x_s16, vhcaddq_rot90_x_s32, vhcaddq_rot90_x_s8, vhsubq_x_n_s16, vhsubq_x_n_s32,
vhsubq_x_n_s8, vhsubq_x_n_u16, vhsubq_x_n_u32, vhsubq_x_n_u8, vhsubq_x_s16, vhsubq_x_s32, vhsubq_x_s8,
vhsubq_x_u16, vhsubq_x_u32, vhsubq_x_u8, vidupq_x_n_u16, vidupq_x_n_u32, vidupq_x_n_u8, vidupq_x_wb_u16,
vidupq_x_wb_u32, vidupq_x_wb_u8, viwdupq_x_n_u16, viwdupq_x_n_u32, viwdupq_x_n_u8, viwdupq_x_wb_u16,
viwdupq_x_wb_u32, viwdupq_x_wb_u8, vmaxnmq_x_f16, vmaxnmq_x_f32, vmaxq_x_s16, vmaxq_x_s32, vmaxq_x_s8,
vmaxq_x_u16, vmaxq_x_u32, vmaxq_x_u8, vminnmq_x_f16, vminnmq_x_f32, vminq_x_s16, vminq_x_s32, vminq_x_s8,
vminq_x_u16, vminq_x_u32, vminq_x_u8, vmovlbq_x_s16, vmovlbq_x_s8, vmovlbq_x_u16, vmovlbq_x_u8,
vmovltq_x_s16, vmovltq_x_s8, vmovltq_x_u16, vmovltq_x_u8, vmulhq_x_s16, vmulhq_x_s32, vmulhq_x_s8,
vmulhq_x_u16, vmulhq_x_u32, vmulhq_x_u8, vmullbq_int_x_s16, vmullbq_int_x_s32, vmullbq_int_x_s8,
vmullbq_int_x_u16, vmullbq_int_x_u32, vmullbq_int_x_u8, vmullbq_poly_x_p16, vmullbq_poly_x_p8,
vmulltq_int_x_s16, vmulltq_int_x_s32, vmulltq_int_x_s8, vmulltq_int_x_u16, vmulltq_int_x_u32,
vmulltq_int_x_u8, vmulltq_poly_x_p16, vmulltq_poly_x_p8, vmulq_x_f16, vmulq_x_f32, vmulq_x_n_f16,
vmulq_x_n_f32, vmulq_x_n_s16, vmulq_x_n_s32, vmulq_x_n_s8, vmulq_x_n_u16, vmulq_x_n_u32, vmulq_x_n_u8,
vmulq_x_s16, vmulq_x_s32, vmulq_x_s8, vmulq_x_u16, vmulq_x_u32, vmulq_x_u8, vmvnq_x_n_s16, vmvnq_x_n_s32,
vmvnq_x_n_u16, vmvnq_x_n_u32, vmvnq_x_s16, vmvnq_x_s32, vmvnq_x_s8, vmvnq_x_u16, vmvnq_x_u32, vmvnq_x_u8,
vnegq_x_f16, vnegq_x_f32, vnegq_x_s16, vnegq_x_s32, vnegq_x_s8, vornq_x_f16, vornq_x_f32, vornq_x_s16,
vornq_x_s32, vornq_x_s8, vornq_x_u16, vornq_x_u32, vornq_x_u8, vorrq_x_f16, vorrq_x_f32, vorrq_x_s16,
vorrq_x_s32, vorrq_x_s8, vorrq_x_u16, vorrq_x_u32, vorrq_x_u8, vrev16q_x_s8, vrev16q_x_u8, vrev32q_x_f16,
vrev32q_x_s16, vrev32q_x_s8, vrev32q_x_u16, vrev32q_x_u8, vrev64q_x_f16, vrev64q_x_f32, vrev64q_x_s16,
vrev64q_x_s32, vrev64q_x_s8, vrev64q_x_u16, vrev64q_x_u32, vrev64q_x_u8, vrhaddq_x_s16, vrhaddq_x_s32,
vrhaddq_x_s8, vrhaddq_x_u16, vrhaddq_x_u32, vrhaddq_x_u8, vrmulhq_x_s16, vrmulhq_x_s32, vrmulhq_x_s8,
vrmulhq_x_u16, vrmulhq_x_u32, vrmulhq_x_u8, vrndaq_x_f16, vrndaq_x_f32, vrndmq_x_f16, vrndmq_x_f32,
vrndnq_x_f16, vrndnq_x_f32, vrndpq_x_f16, vrndpq_x_f32, vrndq_x_f16, vrndq_x_f32, vrndxq_x_f16,
vrndxq_x_f32, vrshlq_x_s16, vrshlq_x_s32, vrshlq_x_s8, vrshlq_x_u16, vrshlq_x_u32, vrshlq_x_u8,
vrshrq_x_n_s16, vrshrq_x_n_s32, vrshrq_x_n_s8, vrshrq_x_n_u16, vrshrq_x_n_u32, vrshrq_x_n_u8,
vshllbq_x_n_s16, vshllbq_x_n_s8, vshllbq_x_n_u16, vshllbq_x_n_u8, vshlltq_x_n_s16, vshlltq_x_n_s8,
vshlltq_x_n_u16, vshlltq_x_n_u8, vshlq_x_n_s16, vshlq_x_n_s32, vshlq_x_n_s8, vshlq_x_n_u16, vshlq_x_n_u32,
vshlq_x_n_u8, vshlq_x_s16, vshlq_x_s32, vshlq_x_s8, vshlq_x_u16, vshlq_x_u32, vshlq_x_u8, vshrq_x_n_s16,
vshrq_x_n_s32, vshrq_x_n_s8, vshrq_x_n_u16, vshrq_x_n_u32, vshrq_x_n_u8, vsubq_x_f16, vsubq_x_f32,
vsubq_x_n_f16, vsubq_x_n_f32, vsubq_x_n_s16, vsubq_x_n_s32, vsubq_x_n_s8, vsubq_x_n_u16, vsubq_x_n_u32,
vsubq_x_n_u8, vsubq_x_s16, vsubq_x_s32, vsubq_x_s8, vsubq_x_u16, vsubq_x_u32, vsubq_x_u8.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
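As a sketch of the relationship described above (intrinsic signatures per the MVE
ACLE), the `_x` form and the merge form it is sugar for:

    #include <arm_mve.h>

    int32x4_t add_x (int32x4_t a, int32x4_t b, mve_pred16_t p)
    {
      return vaddq_x_s32 (a, b, p);   /* don't-care: inactive lanes undefined */
    }

    int32x4_t add_x_equiv (int32x4_t a, int32x4_t b, mve_pred16_t p)
    {
      /* The _m (merge) intrinsic with a vuninitializedq inactive argument.  */
      return vaddq_m_s32 (vuninitializedq_s32 (), a, b, p);
    }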
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vddupq_x_n_u8): Define macro.
(vddupq_x_n_u16): Likewise.
(vddupq_x_n_u32): Likewise.
(vddupq_x_wb_u8): Likewise.
(vddupq_x_wb_u16): Likewise.
(vddupq_x_wb_u32): Likewise.
(vdwdupq_x_n_u8): Likewise.
(vdwdupq_x_n_u16): Likewise.
(vdwdupq_x_n_u32): Likewise.
(vdwdupq_x_wb_u8): Likewise.
(vdwdupq_x_wb_u16): Likewise.
(vdwdupq_x_wb_u32): Likewise.
(vidupq_x_n_u8): Likewise.
(vidupq_x_n_u16): Likewise.
(vidupq_x_n_u32): Likewise.
(vidupq_x_wb_u8): Likewise.
(vidupq_x_wb_u16): Likewise.
(vidupq_x_wb_u32): Likewise.
(viwdupq_x_n_u8): Likewise.
(viwdupq_x_n_u16): Likewise.
(viwdupq_x_n_u32): Likewise.
(viwdupq_x_wb_u8): Likewise.
(viwdupq_x_wb_u16): Likewise.
(viwdupq_x_wb_u32): Likewise.
(vdupq_x_n_s8): Likewise.
(vdupq_x_n_s16): Likewise.
(vdupq_x_n_s32): Likewise.
(vdupq_x_n_u8): Likewise.
(vdupq_x_n_u16): Likewise.
(vdupq_x_n_u32): Likewise.
(vminq_x_s8): Likewise.
(vminq_x_s16): Likewise.
(vminq_x_s32): Likewise.
(vminq_x_u8): Likewise.
(vminq_x_u16): Likewise.
(vminq_x_u32): Likewise.
(vmaxq_x_s8): Likewise.
(vmaxq_x_s16): Likewise.
(vmaxq_x_s32): Likewise.
(vmaxq_x_u8): Likewise.
(vmaxq_x_u16): Likewise.
(vmaxq_x_u32): Likewise.
(vabdq_x_s8): Likewise.
(vabdq_x_s16): Likewise.
(vabdq_x_s32): Likewise.
(vabdq_x_u8): Likewise.
(vabdq_x_u16): Likewise.
(vabdq_x_u32): Likewise.
(vabsq_x_s8): Likewise.
(vabsq_x_s16): Likewise.
(vabsq_x_s32): Likewise.
(vaddq_x_s8): Likewise.
(vaddq_x_s16): Likewise.
(vaddq_x_s32): Likewise.
(vaddq_x_n_s8): Likewise.
(vaddq_x_n_s16): Likewise.
(vaddq_x_n_s32): Likewise.
(vaddq_x_u8): Likewise.
(vaddq_x_u16): Likewise.
(vaddq_x_u32): Likewise.
(vaddq_x_n_u8): Likewise.
(vaddq_x_n_u16): Likewise.
(vaddq_x_n_u32): Likewise.
(vclsq_x_s8): Likewise.
(vclsq_x_s16): Likewise.
(vclsq_x_s32): Likewise.
(vclzq_x_s8): Likewise.
(vclzq_x_s16): Likewise.
(vclzq_x_s32): Likewise.
(vclzq_x_u8): Likewise.
(vclzq_x_u16): Likewise.
(vclzq_x_u32): Likewise.
(vnegq_x_s8): Likewise.
(vnegq_x_s16): Likewise.
(vnegq_x_s32): Likewise.
(vmulhq_x_s8): Likewise.
(vmulhq_x_s16): Likewise.
(vmulhq_x_s32): Likewise.
(vmulhq_x_u8): Likewise.
(vmulhq_x_u16): Likewise.
(vmulhq_x_u32): Likewise.
(vmullbq_poly_x_p8): Likewise.
(vmullbq_poly_x_p16): Likewise.
(vmullbq_int_x_s8): Likewise.
(vmullbq_int_x_s16): Likewise.
(vmullbq_int_x_s32): Likewise.
(vmullbq_int_x_u8): Likewise.
(vmullbq_int_x_u16): Likewise.
(vmullbq_int_x_u32): Likewise.
(vmulltq_poly_x_p8): Likewise.
(vmulltq_poly_x_p16): Likewise.
(vmulltq_int_x_s8): Likewise.
(vmulltq_int_x_s16): Likewise.
(vmulltq_int_x_s32): Likewise.
(vmulltq_int_x_u8): Likewise.
(vmulltq_int_x_u16): Likewise.
(vmulltq_int_x_u32): Likewise.
(vmulq_x_s8): Likewise.
(vmulq_x_s16): Likewise.
(vmulq_x_s32): Likewise.
(vmulq_x_n_s8): Likewise.
(vmulq_x_n_s16): Likewise.
(vmulq_x_n_s32): Likewise.
(vmulq_x_u8): Likewise.
(vmulq_x_u16): Likewise.
(vmulq_x_u32): Likewise.
(vmulq_x_n_u8): Likewise.
(vmulq_x_n_u16): Likewise.
(vmulq_x_n_u32): Likewise.
(vsubq_x_s8): Likewise.
(vsubq_x_s16): Likewise.
(vsubq_x_s32): Likewise.
(vsubq_x_n_s8): Likewise.
(vsubq_x_n_s16): Likewise.
(vsubq_x_n_s32): Likewise.
(vsubq_x_u8): Likewise.
(vsubq_x_u16): Likewise.
(vsubq_x_u32): Likewise.
(vsubq_x_n_u8): Likewise.
(vsubq_x_n_u16): Likewise.
(vsubq_x_n_u32): Likewise.
(vcaddq_rot90_x_s8): Likewise.
(vcaddq_rot90_x_s16): Likewise.
(vcaddq_rot90_x_s32): Likewise.
(vcaddq_rot90_x_u8): Likewise.
(vcaddq_rot90_x_u16): Likewise.
(vcaddq_rot90_x_u32): Likewise.
(vcaddq_rot270_x_s8): Likewise.
(vcaddq_rot270_x_s16): Likewise.
(vcaddq_rot270_x_s32): Likewise.
(vcaddq_rot270_x_u8): Likewise.
(vcaddq_rot270_x_u16): Likewise.
(vcaddq_rot270_x_u32): Likewise.
(vhaddq_x_n_s8): Likewise.
(vhaddq_x_n_s16): Likewise.
(vhaddq_x_n_s32): Likewise.
(vhaddq_x_n_u8): Likewise.
(vhaddq_x_n_u16): Likewise.
(vhaddq_x_n_u32): Likewise.
(vhaddq_x_s8): Likewise.
(vhaddq_x_s16): Likewise.
(vhaddq_x_s32): Likewise.
(vhaddq_x_u8): Likewise.
(vhaddq_x_u16): Likewise.
(vhaddq_x_u32): Likewise.
(vhcaddq_rot90_x_s8): Likewise.
(vhcaddq_rot90_x_s16): Likewise.
(vhcaddq_rot90_x_s32): Likewise.
(vhcaddq_rot270_x_s8): Likewise.
(vhcaddq_rot270_x_s16): Likewise.
(vhcaddq_rot270_x_s32): Likewise.
(vhsubq_x_n_s8): Likewise.
(vhsubq_x_n_s16): Likewise.
(vhsubq_x_n_s32): Likewise.
(vhsubq_x_n_u8): Likewise.
(vhsubq_x_n_u16): Likewise.
(vhsubq_x_n_u32): Likewise.
(vhsubq_x_s8): Likewise.
(vhsubq_x_s16): Likewise.
(vhsubq_x_s32): Likewise.
(vhsubq_x_u8): Likewise.
(vhsubq_x_u16): Likewise.
(vhsubq_x_u32): Likewise.
(vrhaddq_x_s8): Likewise.
(vrhaddq_x_s16): Likewise.
(vrhaddq_x_s32): Likewise.
(vrhaddq_x_u8): Likewise.
(vrhaddq_x_u16): Likewise.
(vrhaddq_x_u32): Likewise.
(vrmulhq_x_s8): Likewise.
(vrmulhq_x_s16): Likewise.
(vrmulhq_x_s32): Likewise.
(vrmulhq_x_u8): Likewise.
(vrmulhq_x_u16): Likewise.
(vrmulhq_x_u32): Likewise.
(vandq_x_s8): Likewise.
(vandq_x_s16): Likewise.
(vandq_x_s32): Likewise.
(vandq_x_u8): Likewise.
(vandq_x_u16): Likewise.
(vandq_x_u32): Likewise.
(vbicq_x_s8): Likewise.
(vbicq_x_s16): Likewise.
(vbicq_x_s32): Likewise.
(vbicq_x_u8): Likewise.
(vbicq_x_u16): Likewise.
(vbicq_x_u32): Likewise.
(vbrsrq_x_n_s8): Likewise.
(vbrsrq_x_n_s16): Likewise.
(vbrsrq_x_n_s32): Likewise.
(vbrsrq_x_n_u8): Likewise.
(vbrsrq_x_n_u16): Likewise.
(vbrsrq_x_n_u32): Likewise.
(veorq_x_s8): Likewise.
(veorq_x_s16): Likewise.
(veorq_x_s32): Likewise.
(veorq_x_u8): Likewise.
(veorq_x_u16): Likewise.
(veorq_x_u32): Likewise.
(vmovlbq_x_s8): Likewise.
(vmovlbq_x_s16): Likewise.
(vmovlbq_x_u8): Likewise.
(vmovlbq_x_u16): Likewise.
(vmovltq_x_s8): Likewise.
(vmovltq_x_s16): Likewise.
(vmovltq_x_u8): Likewise.
(vmovltq_x_u16): Likewise.
(vmvnq_x_s8): Likewise.
(vmvnq_x_s16): Likewise.
(vmvnq_x_s32): Likewise.
(vmvnq_x_u8): Likewise.
(vmvnq_x_u16): Likewise.
(vmvnq_x_u32): Likewise.
(vmvnq_x_n_s16): Likewise.
(vmvnq_x_n_s32): Likewise.
(vmvnq_x_n_u16): Likewise.
(vmvnq_x_n_u32): Likewise.
(vornq_x_s8): Likewise.
(vornq_x_s16): Likewise.
(vornq_x_s32): Likewise.
(vornq_x_u8): Likewise.
(vornq_x_u16): Likewise.
(vornq_x_u32): Likewise.
(vorrq_x_s8): Likewise.
(vorrq_x_s16): Likewise.
(vorrq_x_s32): Likewise.
(vorrq_x_u8): Likewise.
(vorrq_x_u16): Likewise.
(vorrq_x_u32): Likewise.
(vrev16q_x_s8): Likewise.
(vrev16q_x_u8): Likewise.
(vrev32q_x_s8): Likewise.
(vrev32q_x_s16): Likewise.
(vrev32q_x_u8): Likewise.
(vrev32q_x_u16): Likewise.
(vrev64q_x_s8): Likewise.
(vrev64q_x_s16): Likewise.
(vrev64q_x_s32): Likewise.
(vrev64q_x_u8): Likewise.
(vrev64q_x_u16): Likewise.
(vrev64q_x_u32): Likewise.
(vrshlq_x_s8): Likewise.
(vrshlq_x_s16): Likewise.
(vrshlq_x_s32): Likewise.
(vrshlq_x_u8): Likewise.
(vrshlq_x_u16): Likewise.
(vrshlq_x_u32): Likewise.
(vshllbq_x_n_s8): Likewise.
(vshllbq_x_n_s16): Likewise.
(vshllbq_x_n_u8): Likewise.
(vshllbq_x_n_u16): Likewise.
(vshlltq_x_n_s8): Likewise.
(vshlltq_x_n_s16): Likewise.
(vshlltq_x_n_u8): Likewise.
(vshlltq_x_n_u16): Likewise.
(vshlq_x_s8): Likewise.
(vshlq_x_s16): Likewise.
(vshlq_x_s32): Likewise.
(vshlq_x_u8): Likewise.
(vshlq_x_u16): Likewise.
(vshlq_x_u32): Likewise.
(vshlq_x_n_s8): Likewise.
(vshlq_x_n_s16): Likewise.
(vshlq_x_n_s32): Likewise.
(vshlq_x_n_u8): Likewise.
(vshlq_x_n_u16): Likewise.
(vshlq_x_n_u32): Likewise.
(vrshrq_x_n_s8): Likewise.
(vrshrq_x_n_s16): Likewise.
(vrshrq_x_n_s32): Likewise.
(vrshrq_x_n_u8): Likewise.
(vrshrq_x_n_u16): Likewise.
(vrshrq_x_n_u32): Likewise.
(vshrq_x_n_s8): Likewise.
(vshrq_x_n_s16): Likewise.
(vshrq_x_n_s32): Likewise.
(vshrq_x_n_u8): Likewise.
(vshrq_x_n_u16): Likewise.
(vshrq_x_n_u32): Likewise.
(vdupq_x_n_f16): Likewise.
(vdupq_x_n_f32): Likewise.
(vminnmq_x_f16): Likewise.
(vminnmq_x_f32): Likewise.
(vmaxnmq_x_f16): Likewise.
(vmaxnmq_x_f32): Likewise.
(vabdq_x_f16): Likewise.
(vabdq_x_f32): Likewise.
(vabsq_x_f16): Likewise.
(vabsq_x_f32): Likewise.
(vaddq_x_f16): Likewise.
(vaddq_x_f32): Likewise.
(vaddq_x_n_f16): Likewise.
(vaddq_x_n_f32): Likewise.
(vnegq_x_f16): Likewise.
(vnegq_x_f32): Likewise.
(vmulq_x_f16): Likewise.
(vmulq_x_f32): Likewise.
(vmulq_x_n_f16): Likewise.
(vmulq_x_n_f32): Likewise.
(vsubq_x_f16): Likewise.
(vsubq_x_f32): Likewise.
(vsubq_x_n_f16): Likewise.
(vsubq_x_n_f32): Likewise.
(vcaddq_rot90_x_f16): Likewise.
(vcaddq_rot90_x_f32): Likewise.
(vcaddq_rot270_x_f16): Likewise.
(vcaddq_rot270_x_f32): Likewise.
(vcmulq_x_f16): Likewise.
(vcmulq_x_f32): Likewise.
(vcmulq_rot90_x_f16): Likewise.
(vcmulq_rot90_x_f32): Likewise.
(vcmulq_rot180_x_f16): Likewise.
(vcmulq_rot180_x_f32): Likewise.
(vcmulq_rot270_x_f16): Likewise.
(vcmulq_rot270_x_f32): Likewise.
(vcvtaq_x_s16_f16): Likewise.
(vcvtaq_x_s32_f32): Likewise.
(vcvtaq_x_u16_f16): Likewise.
(vcvtaq_x_u32_f32): Likewise.
(vcvtnq_x_s16_f16): Likewise.
(vcvtnq_x_s32_f32): Likewise.
(vcvtnq_x_u16_f16): Likewise.
(vcvtnq_x_u32_f32): Likewise.
(vcvtpq_x_s16_f16): Likewise.
(vcvtpq_x_s32_f32): Likewise.
(vcvtpq_x_u16_f16): Likewise.
(vcvtpq_x_u32_f32): Likewise.
(vcvtmq_x_s16_f16): Likewise.
(vcvtmq_x_s32_f32): Likewise.
(vcvtmq_x_u16_f16): Likewise.
(vcvtmq_x_u32_f32): Likewise.
(vcvtbq_x_f32_f16): Likewise.
(vcvttq_x_f32_f16): Likewise.
(vcvtq_x_f16_u16): Likewise.
(vcvtq_x_f16_s16): Likewise.
(vcvtq_x_f32_s32): Likewise.
(vcvtq_x_f32_u32): Likewise.
(vcvtq_x_n_f16_s16): Likewise.
(vcvtq_x_n_f16_u16): Likewise.
(vcvtq_x_n_f32_s32): Likewise.
(vcvtq_x_n_f32_u32): Likewise.
(vcvtq_x_s16_f16): Likewise.
(vcvtq_x_s32_f32): Likewise.
(vcvtq_x_u16_f16): Likewise.
(vcvtq_x_u32_f32): Likewise.
(vcvtq_x_n_s16_f16): Likewise.
(vcvtq_x_n_s32_f32): Likewise.
(vcvtq_x_n_u16_f16): Likewise.
(vcvtq_x_n_u32_f32): Likewise.
(vrndq_x_f16): Likewise.
(vrndq_x_f32): Likewise.
(vrndnq_x_f16): Likewise.
(vrndnq_x_f32): Likewise.
(vrndmq_x_f16): Likewise.
(vrndmq_x_f32): Likewise.
(vrndpq_x_f16): Likewise.
(vrndpq_x_f32): Likewise.
(vrndaq_x_f16): Likewise.
(vrndaq_x_f32): Likewise.
(vrndxq_x_f16): Likewise.
(vrndxq_x_f32): Likewise.
(vandq_x_f16): Likewise.
(vandq_x_f32): Likewise.
(vbicq_x_f16): Likewise.
(vbicq_x_f32): Likewise.
(vbrsrq_x_n_f16): Likewise.
(vbrsrq_x_n_f32): Likewise.
(veorq_x_f16): Likewise.
(veorq_x_f32): Likewise.
(vornq_x_f16): Likewise.
(vornq_x_f32): Likewise.
(vorrq_x_f16): Likewise.
(vorrq_x_f32): Likewise.
(vrev32q_x_f16): Likewise.
(vrev64q_x_f16): Likewise.
(vrev64q_x_f32): Likewise.
(__arm_vddupq_x_n_u8): Define intrinsic.
(__arm_vddupq_x_n_u16): Likewise.
(__arm_vddupq_x_n_u32): Likewise.
(__arm_vddupq_x_wb_u8): Likewise.
(__arm_vddupq_x_wb_u16): Likewise.
(__arm_vddupq_x_wb_u32): Likewise.
(__arm_vdwdupq_x_n_u8): Likewise.
(__arm_vdwdupq_x_n_u16): Likewise.
(__arm_vdwdupq_x_n_u32): Likewise.
(__arm_vdwdupq_x_wb_u8): Likewise.
(__arm_vdwdupq_x_wb_u16): Likewise.
(__arm_vdwdupq_x_wb_u32): Likewise.
(__arm_vidupq_x_n_u8): Likewise.
(__arm_vidupq_x_n_u16): Likewise.
(__arm_vidupq_x_n_u32): Likewise.
(__arm_vidupq_x_wb_u8): Likewise.
(__arm_vidupq_x_wb_u16): Likewise.
(__arm_vidupq_x_wb_u32): Likewise.
(__arm_viwdupq_x_n_u8): Likewise.
(__arm_viwdupq_x_n_u16): Likewise.
(__arm_viwdupq_x_n_u32): Likewise.
(__arm_viwdupq_x_wb_u8): Likewise.
(__arm_viwdupq_x_wb_u16): Likewise.
(__arm_viwdupq_x_wb_u32): Likewise.
(__arm_vdupq_x_n_s8): Likewise.
(__arm_vdupq_x_n_s16): Likewise.
(__arm_vdupq_x_n_s32): Likewise.
(__arm_vdupq_x_n_u8): Likewise.
(__arm_vdupq_x_n_u16): Likewise.
(__arm_vdupq_x_n_u32): Likewise.
(__arm_vminq_x_s8): Likewise.
(__arm_vminq_x_s16): Likewise.
(__arm_vminq_x_s32): Likewise.
(__arm_vminq_x_u8): Likewise.
(__arm_vminq_x_u16): Likewise.
(__arm_vminq_x_u32): Likewise.
(__arm_vmaxq_x_s8): Likewise.
(__arm_vmaxq_x_s16): Likewise.
(__arm_vmaxq_x_s32): Likewise.
(__arm_vmaxq_x_u8): Likewise.
(__arm_vmaxq_x_u16): Likewise.
(__arm_vmaxq_x_u32): Likewise.
(__arm_vabdq_x_s8): Likewise.
(__arm_vabdq_x_s16): Likewise.
(__arm_vabdq_x_s32): Likewise.
(__arm_vabdq_x_u8): Likewise.
(__arm_vabdq_x_u16): Likewise.
(__arm_vabdq_x_u32): Likewise.
(__arm_vabsq_x_s8): Likewise.
(__arm_vabsq_x_s16): Likewise.
(__arm_vabsq_x_s32): Likewise.
(__arm_vaddq_x_s8): Likewise.
(__arm_vaddq_x_s16): Likewise.
(__arm_vaddq_x_s32): Likewise.
(__arm_vaddq_x_n_s8): Likewise.
(__arm_vaddq_x_n_s16): Likewise.
(__arm_vaddq_x_n_s32): Likewise.
(__arm_vaddq_x_u8): Likewise.
(__arm_vaddq_x_u16): Likewise.
(__arm_vaddq_x_u32): Likewise.
(__arm_vaddq_x_n_u8): Likewise.
(__arm_vaddq_x_n_u16): Likewise.
(__arm_vaddq_x_n_u32): Likewise.
(__arm_vclsq_x_s8): Likewise.
(__arm_vclsq_x_s16): Likewise.
(__arm_vclsq_x_s32): Likewise.
(__arm_vclzq_x_s8): Likewise.
(__arm_vclzq_x_s16): Likewise.
(__arm_vclzq_x_s32): Likewise.
(__arm_vclzq_x_u8): Likewise.
(__arm_vclzq_x_u16): Likewise.
(__arm_vclzq_x_u32): Likewise.
(__arm_vnegq_x_s8): Likewise.
(__arm_vnegq_x_s16): Likewise.
(__arm_vnegq_x_s32): Likewise.
(__arm_vmulhq_x_s8): Likewise.
(__arm_vmulhq_x_s16): Likewise.
(__arm_vmulhq_x_s32): Likewise.
(__arm_vmulhq_x_u8): Likewise.
(__arm_vmulhq_x_u16): Likewise.
(__arm_vmulhq_x_u32): Likewise.
(__arm_vmullbq_poly_x_p8): Likewise.
(__arm_vmullbq_poly_x_p16): Likewise.
(__arm_vmullbq_int_x_s8): Likewise.
(__arm_vmullbq_int_x_s16): Likewise.
(__arm_vmullbq_int_x_s32): Likewise.
(__arm_vmullbq_int_x_u8): Likewise.
(__arm_vmullbq_int_x_u16): Likewise.
(__arm_vmullbq_int_x_u32): Likewise.
(__arm_vmulltq_poly_x_p8): Likewise.
(__arm_vmulltq_poly_x_p16): Likewise.
(__arm_vmulltq_int_x_s8): Likewise.
(__arm_vmulltq_int_x_s16): Likewise.
(__arm_vmulltq_int_x_s32): Likewise.
(__arm_vmulltq_int_x_u8): Likewise.
(__arm_vmulltq_int_x_u16): Likewise.
(__arm_vmulltq_int_x_u32): Likewise.
(__arm_vmulq_x_s8): Likewise.
(__arm_vmulq_x_s16): Likewise.
(__arm_vmulq_x_s32): Likewise.
(__arm_vmulq_x_n_s8): Likewise.
(__arm_vmulq_x_n_s16): Likewise.
(__arm_vmulq_x_n_s32): Likewise.
(__arm_vmulq_x_u8): Likewise.
(__arm_vmulq_x_u16): Likewise.
(__arm_vmulq_x_u32): Likewise.
(__arm_vmulq_x_n_u8): Likewise.
(__arm_vmulq_x_n_u16): Likewise.
(__arm_vmulq_x_n_u32): Likewise.
(__arm_vsubq_x_s8): Likewise.
(__arm_vsubq_x_s16): Likewise.
(__arm_vsubq_x_s32): Likewise.
(__arm_vsubq_x_n_s8): Likewise.
(__arm_vsubq_x_n_s16): Likewise.
(__arm_vsubq_x_n_s32): Likewise.
(__arm_vsubq_x_u8): Likewise.
(__arm_vsubq_x_u16): Likewise.
(__arm_vsubq_x_u32): Likewise.
(__arm_vsubq_x_n_u8): Likewise.
(__arm_vsubq_x_n_u16): Likewise.
(__arm_vsubq_x_n_u32): Likewise.
(__arm_vcaddq_rot90_x_s8): Likewise.
(__arm_vcaddq_rot90_x_s16): Likewise.
(__arm_vcaddq_rot90_x_s32): Likewise.
(__arm_vcaddq_rot90_x_u8): Likewise.
(__arm_vcaddq_rot90_x_u16): Likewise.
(__arm_vcaddq_rot90_x_u32): Likewise.
(__arm_vcaddq_rot270_x_s8): Likewise.
(__arm_vcaddq_rot270_x_s16): Likewise.
(__arm_vcaddq_rot270_x_s32): Likewise.
(__arm_vcaddq_rot270_x_u8): Likewise.
(__arm_vcaddq_rot270_x_u16): Likewise.
(__arm_vcaddq_rot270_x_u32): Likewise.
(__arm_vhaddq_x_n_s8): Likewise.
(__arm_vhaddq_x_n_s16): Likewise.
(__arm_vhaddq_x_n_s32): Likewise.
(__arm_vhaddq_x_n_u8): Likewise.
(__arm_vhaddq_x_n_u16): Likewise.
(__arm_vhaddq_x_n_u32): Likewise.
(__arm_vhaddq_x_s8): Likewise.
(__arm_vhaddq_x_s16): Likewise.
(__arm_vhaddq_x_s32): Likewise.
(__arm_vhaddq_x_u8): Likewise.
(__arm_vhaddq_x_u16): Likewise.
(__arm_vhaddq_x_u32): Likewise.
(__arm_vhcaddq_rot90_x_s8): Likewise.
(__arm_vhcaddq_rot90_x_s16): Likewise.
(__arm_vhcaddq_rot90_x_s32): Likewise.
(__arm_vhcaddq_rot270_x_s8): Likewise.
(__arm_vhcaddq_rot270_x_s16): Likewise.
(__arm_vhcaddq_rot270_x_s32): Likewise.
(__arm_vhsubq_x_n_s8): Likewise.
(__arm_vhsubq_x_n_s16): Likewise.
(__arm_vhsubq_x_n_s32): Likewise.
(__arm_vhsubq_x_n_u8): Likewise.
(__arm_vhsubq_x_n_u16): Likewise.
(__arm_vhsubq_x_n_u32): Likewise.
(__arm_vhsubq_x_s8): Likewise.
(__arm_vhsubq_x_s16): Likewise.
(__arm_vhsubq_x_s32): Likewise.
(__arm_vhsubq_x_u8): Likewise.
(__arm_vhsubq_x_u16): Likewise.
(__arm_vhsubq_x_u32): Likewise.
(__arm_vrhaddq_x_s8): Likewise.
(__arm_vrhaddq_x_s16): Likewise.
(__arm_vrhaddq_x_s32): Likewise.
(__arm_vrhaddq_x_u8): Likewise.
(__arm_vrhaddq_x_u16): Likewise.
(__arm_vrhaddq_x_u32): Likewise.
(__arm_vrmulhq_x_s8): Likewise.
(__arm_vrmulhq_x_s16): Likewise.
(__arm_vrmulhq_x_s32): Likewise.
(__arm_vrmulhq_x_u8): Likewise.
(__arm_vrmulhq_x_u16): Likewise.
(__arm_vrmulhq_x_u32): Likewise.
(__arm_vandq_x_s8): Likewise.
(__arm_vandq_x_s16): Likewise.
(__arm_vandq_x_s32): Likewise.
(__arm_vandq_x_u8): Likewise.
(__arm_vandq_x_u16): Likewise.
(__arm_vandq_x_u32): Likewise.
(__arm_vbicq_x_s8): Likewise.
(__arm_vbicq_x_s16): Likewise.
(__arm_vbicq_x_s32): Likewise.
(__arm_vbicq_x_u8): Likewise.
(__arm_vbicq_x_u16): Likewise.
(__arm_vbicq_x_u32): Likewise.
(__arm_vbrsrq_x_n_s8): Likewise.
(__arm_vbrsrq_x_n_s16): Likewise.
(__arm_vbrsrq_x_n_s32): Likewise.
(__arm_vbrsrq_x_n_u8): Likewise.
(__arm_vbrsrq_x_n_u16): Likewise.
(__arm_vbrsrq_x_n_u32): Likewise.
(__arm_veorq_x_s8): Likewise.
(__arm_veorq_x_s16): Likewise.
(__arm_veorq_x_s32): Likewise.
(__arm_veorq_x_u8): Likewise.
(__arm_veorq_x_u16): Likewise.
(__arm_veorq_x_u32): Likewise.
(__arm_vmovlbq_x_s8): Likewise.
(__arm_vmovlbq_x_s16): Likewise.
(__arm_vmovlbq_x_u8): Likewise.
(__arm_vmovlbq_x_u16): Likewise.
(__arm_vmovltq_x_s8): Likewise.
(__arm_vmovltq_x_s16): Likewise.
(__arm_vmovltq_x_u8): Likewise.
(__arm_vmovltq_x_u16): Likewise.
(__arm_vmvnq_x_s8): Likewise.
(__arm_vmvnq_x_s16): Likewise.
(__arm_vmvnq_x_s32): Likewise.
(__arm_vmvnq_x_u8): Likewise.
(__arm_vmvnq_x_u16): Likewise.
(__arm_vmvnq_x_u32): Likewise.
(__arm_vmvnq_x_n_s16): Likewise.
(__arm_vmvnq_x_n_s32): Likewise.
(__arm_vmvnq_x_n_u16): Likewise.
(__arm_vmvnq_x_n_u32): Likewise.
(__arm_vornq_x_s8): Likewise.
(__arm_vornq_x_s16): Likewise.
(__arm_vornq_x_s32): Likewise.
(__arm_vornq_x_u8): Likewise.
(__arm_vornq_x_u16): Likewise.
(__arm_vornq_x_u32): Likewise.
(__arm_vorrq_x_s8): Likewise.
(__arm_vorrq_x_s16): Likewise.
(__arm_vorrq_x_s32): Likewise.
(__arm_vorrq_x_u8): Likewise.
(__arm_vorrq_x_u16): Likewise.
(__arm_vorrq_x_u32): Likewise.
(__arm_vrev16q_x_s8): Likewise.
(__arm_vrev16q_x_u8): Likewise.
(__arm_vrev32q_x_s8): Likewise.
(__arm_vrev32q_x_s16): Likewise.
(__arm_vrev32q_x_u8): Likewise.
(__arm_vrev32q_x_u16): Likewise.
(__arm_vrev64q_x_s8): Likewise.
(__arm_vrev64q_x_s16): Likewise.
(__arm_vrev64q_x_s32): Likewise.
(__arm_vrev64q_x_u8): Likewise.
(__arm_vrev64q_x_u16): Likewise.
(__arm_vrev64q_x_u32): Likewise.
(__arm_vrshlq_x_s8): Likewise.
(__arm_vrshlq_x_s16): Likewise.
(__arm_vrshlq_x_s32): Likewise.
(__arm_vrshlq_x_u8): Likewise.
(__arm_vrshlq_x_u16): Likewise.
(__arm_vrshlq_x_u32): Likewise.
(__arm_vshllbq_x_n_s8): Likewise.
(__arm_vshllbq_x_n_s16): Likewise.
(__arm_vshllbq_x_n_u8): Likewise.
(__arm_vshllbq_x_n_u16): Likewise.
(__arm_vshlltq_x_n_s8): Likewise.
(__arm_vshlltq_x_n_s16): Likewise.
(__arm_vshlltq_x_n_u8): Likewise.
(__arm_vshlltq_x_n_u16): Likewise.
(__arm_vshlq_x_s8): Likewise.
(__arm_vshlq_x_s16): Likewise.
(__arm_vshlq_x_s32): Likewise.
(__arm_vshlq_x_u8): Likewise.
(__arm_vshlq_x_u16): Likewise.
(__arm_vshlq_x_u32): Likewise.
(__arm_vshlq_x_n_s8): Likewise.
(__arm_vshlq_x_n_s16): Likewise.
(__arm_vshlq_x_n_s32): Likewise.
(__arm_vshlq_x_n_u8): Likewise.
(__arm_vshlq_x_n_u16): Likewise.
(__arm_vshlq_x_n_u32): Likewise.
(__arm_vrshrq_x_n_s8): Likewise.
(__arm_vrshrq_x_n_s16): Likewise.
(__arm_vrshrq_x_n_s32): Likewise.
(__arm_vrshrq_x_n_u8): Likewise.
(__arm_vrshrq_x_n_u16): Likewise.
(__arm_vrshrq_x_n_u32): Likewise.
(__arm_vshrq_x_n_s8): Likewise.
(__arm_vshrq_x_n_s16): Likewise.
(__arm_vshrq_x_n_s32): Likewise.
(__arm_vshrq_x_n_u8): Likewise.
(__arm_vshrq_x_n_u16): Likewise.
(__arm_vshrq_x_n_u32): Likewise.
(__arm_vdupq_x_n_f16): Likewise.
(__arm_vdupq_x_n_f32): Likewise.
(__arm_vminnmq_x_f16): Likewise.
(__arm_vminnmq_x_f32): Likewise.
(__arm_vmaxnmq_x_f16): Likewise.
(__arm_vmaxnmq_x_f32): Likewise.
(__arm_vabdq_x_f16): Likewise.
(__arm_vabdq_x_f32): Likewise.
(__arm_vabsq_x_f16): Likewise.
(__arm_vabsq_x_f32): Likewise.
(__arm_vaddq_x_f16): Likewise.
(__arm_vaddq_x_f32): Likewise.
(__arm_vaddq_x_n_f16): Likewise.
(__arm_vaddq_x_n_f32): Likewise.
(__arm_vnegq_x_f16): Likewise.
(__arm_vnegq_x_f32): Likewise.
(__arm_vmulq_x_f16): Likewise.
(__arm_vmulq_x_f32): Likewise.
(__arm_vmulq_x_n_f16): Likewise.
(__arm_vmulq_x_n_f32): Likewise.
(__arm_vsubq_x_f16): Likewise.
(__arm_vsubq_x_f32): Likewise.
(__arm_vsubq_x_n_f16): Likewise.
(__arm_vsubq_x_n_f32): Likewise.
(__arm_vcaddq_rot90_x_f16): Likewise.
(__arm_vcaddq_rot90_x_f32): Likewise.
(__arm_vcaddq_rot270_x_f16): Likewise.
(__arm_vcaddq_rot270_x_f32): Likewise.
(__arm_vcmulq_x_f16): Likewise.
(__arm_vcmulq_x_f32): Likewise.
(__arm_vcmulq_rot90_x_f16): Likewise.
(__arm_vcmulq_rot90_x_f32): Likewise.
(__arm_vcmulq_rot180_x_f16): Likewise.
(__arm_vcmulq_rot180_x_f32): Likewise.
(__arm_vcmulq_rot270_x_f16): Likewise.
(__arm_vcmulq_rot270_x_f32): Likewise.
(__arm_vcvtaq_x_s16_f16): Likewise.
(__arm_vcvtaq_x_s32_f32): Likewise.
(__arm_vcvtaq_x_u16_f16): Likewise.
(__arm_vcvtaq_x_u32_f32): Likewise.
(__arm_vcvtnq_x_s16_f16): Likewise.
(__arm_vcvtnq_x_s32_f32): Likewise.
(__arm_vcvtnq_x_u16_f16): Likewise.
(__arm_vcvtnq_x_u32_f32): Likewise.
(__arm_vcvtpq_x_s16_f16): Likewise.
(__arm_vcvtpq_x_s32_f32): Likewise.
(__arm_vcvtpq_x_u16_f16): Likewise.
(__arm_vcvtpq_x_u32_f32): Likewise.
(__arm_vcvtmq_x_s16_f16): Likewise.
(__arm_vcvtmq_x_s32_f32): Likewise.
(__arm_vcvtmq_x_u16_f16): Likewise.
(__arm_vcvtmq_x_u32_f32): Likewise.
(__arm_vcvtbq_x_f32_f16): Likewise.
(__arm_vcvttq_x_f32_f16): Likewise.
(__arm_vcvtq_x_f16_u16): Likewise.
(__arm_vcvtq_x_f16_s16): Likewise.
(__arm_vcvtq_x_f32_s32): Likewise.
(__arm_vcvtq_x_f32_u32): Likewise.
(__arm_vcvtq_x_n_f16_s16): Likewise.
(__arm_vcvtq_x_n_f16_u16): Likewise.
(__arm_vcvtq_x_n_f32_s32): Likewise.
(__arm_vcvtq_x_n_f32_u32): Likewise.
(__arm_vcvtq_x_s16_f16): Likewise.
(__arm_vcvtq_x_s32_f32): Likewise.
(__arm_vcvtq_x_u16_f16): Likewise.
(__arm_vcvtq_x_u32_f32): Likewise.
(__arm_vcvtq_x_n_s16_f16): Likewise.
(__arm_vcvtq_x_n_s32_f32): Likewise.
(__arm_vcvtq_x_n_u16_f16): Likewise.
(__arm_vcvtq_x_n_u32_f32): Likewise.
(__arm_vrndq_x_f16): Likewise.
(__arm_vrndq_x_f32): Likewise.
(__arm_vrndnq_x_f16): Likewise.
(__arm_vrndnq_x_f32): Likewise.
(__arm_vrndmq_x_f16): Likewise.
(__arm_vrndmq_x_f32): Likewise.
(__arm_vrndpq_x_f16): Likewise.
(__arm_vrndpq_x_f32): Likewise.
(__arm_vrndaq_x_f16): Likewise.
(__arm_vrndaq_x_f32): Likewise.
(__arm_vrndxq_x_f16): Likewise.
(__arm_vrndxq_x_f32): Likewise.
(__arm_vandq_x_f16): Likewise.
(__arm_vandq_x_f32): Likewise.
(__arm_vbicq_x_f16): Likewise.
(__arm_vbicq_x_f32): Likewise.
(__arm_vbrsrq_x_n_f16): Likewise.
(__arm_vbrsrq_x_n_f32): Likewise.
(__arm_veorq_x_f16): Likewise.
(__arm_veorq_x_f32): Likewise.
(__arm_vornq_x_f16): Likewise.
(__arm_vornq_x_f32): Likewise.
(__arm_vorrq_x_f16): Likewise.
(__arm_vorrq_x_f32): Likewise.
(__arm_vrev32q_x_f16): Likewise.
(__arm_vrev64q_x_f16): Likewise.
(__arm_vrev64q_x_f32): Likewise.
(vabdq_x): Define polymorphic variant.
(vabsq_x): Likewise.
(vaddq_x): Likewise.
(vandq_x): Likewise.
(vbicq_x): Likewise.
(vbrsrq_x): Likewise.
(vcaddq_rot270_x): Likewise.
(vcaddq_rot90_x): Likewise.
(vcmulq_rot180_x): Likewise.
(vcmulq_rot270_x): Likewise.
(vcmulq_x): Likewise.
(vcvtq_x): Likewise.
(vcvtq_x_n): Likewise.
(vcvtnq_m): Likewise.
(veorq_x): Likewise.
(vmaxnmq_x): Likewise.
(vminnmq_x): Likewise.
(vmulq_x): Likewise.
(vnegq_x): Likewise.
(vornq_x): Likewise.
(vorrq_x): Likewise.
(vrev32q_x): Likewise.
(vrev64q_x): Likewise.
(vrndaq_x): Likewise.
(vrndmq_x): Likewise.
(vrndnq_x): Likewise.
(vrndpq_x): Likewise.
(vrndq_x): Likewise.
(vrndxq_x): Likewise.
(vsubq_x): Likewise.
(vcmulq_rot90_x): Likewise.
(vadciq): Likewise.
(vclsq_x): Likewise.
(vclzq_x): Likewise.
(vhaddq_x): Likewise.
(vhcaddq_rot270_x): Likewise.
(vhcaddq_rot90_x): Likewise.
(vhsubq_x): Likewise.
(vmaxq_x): Likewise.
(vminq_x): Likewise.
(vmovlbq_x): Likewise.
(vmovltq_x): Likewise.
(vmulhq_x): Likewise.
(vmullbq_int_x): Likewise.
(vmullbq_poly_x): Likewise.
(vmulltq_int_x): Likewise.
(vmulltq_poly_x): Likewise.
(vmvnq_x): Likewise.
(vrev16q_x): Likewise.
(vrhaddq_x): Likewise.
(vrmulhq_x): Likewise.
(vrshlq_x): Likewise.
(vrshrq_x): Likewise.
(vshllbq_x): Likewise.
(vshlltq_x): Likewise.
(vshlq_x_n): Likewise.
(vshlq_x): Likewise.
(vdwdupq_x_u8): Likewise.
(vdwdupq_x_u16): Likewise.
(vdwdupq_x_u32): Likewise.
(viwdupq_x_u8): Likewise.
(viwdupq_x_u16): Likewise.
(viwdupq_x_u32): Likewise.
(vidupq_x_u8): Likewise.
(vddupq_x_u8): Likewise.
(vidupq_x_u16): Likewise.
(vddupq_x_u16): Likewise.
(vidupq_x_u32): Likewise.
(vddupq_x_u32): Likewise.
(vshrq_x): Likewise.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_x_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_x_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_x_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_x_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_x_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_x_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_x_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_x_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_x_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_x_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_x_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_x_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_x_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_x_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_x_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_x_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_x_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_x_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_n_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_x_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_x_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_x_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_x_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_x_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_x_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_x_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_x_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_x_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_x_u8.c: Likewise.
Richard Biener [Fri, 20 Mar 2020 14:00:11 +0000 (15:00 +0100)]
fix CTOR vectorization
We failed to handle pattern stmts appropriately.
2020-03-20 Richard Biener <rguenther@suse.de>
* tree-vect-slp.c (vect_analyze_slp_instance): Push the stmts
to vectorize for CTOR defs.
Srinath Parvathaneni [Fri, 20 Mar 2020 12:06:26 +0000 (12:06 +0000)]
[ARM][GCC][2/8x]: MVE ACLE gather load and scatter store intrinsics with writeback.
This patch supports the following MVE ACLE intrinsics with writeback.
vldrdq_gather_base_wb_s64, vldrdq_gather_base_wb_u64, vldrdq_gather_base_wb_z_s64,
vldrdq_gather_base_wb_z_u64, vldrwq_gather_base_wb_f32, vldrwq_gather_base_wb_s32,
vldrwq_gather_base_wb_u32, vldrwq_gather_base_wb_z_f32, vldrwq_gather_base_wb_z_s32,
vldrwq_gather_base_wb_z_u32, vstrdq_scatter_base_wb_p_s64, vstrdq_scatter_base_wb_p_u64,
vstrdq_scatter_base_wb_s64, vstrdq_scatter_base_wb_u64, vstrwq_scatter_base_wb_p_s32,
vstrwq_scatter_base_wb_p_f32, vstrwq_scatter_base_wb_p_u32, vstrwq_scatter_base_wb_s32,
vstrwq_scatter_base_wb_u32, vstrwq_scatter_base_wb_f32.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
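As a rough usage sketch (not part of the patch; the exact prototype is assumed
from the MVE ACLE): a base-plus-writeback gather load reads from the addresses
in the base vector plus an immediate offset and writes the updated addresses
back through the pointer.

    #include <arm_mve.h>

    /* Load four 32-bit words from (*base + 4) for each lane and write the
       incremented base addresses back to *base (writeback form).  */
    uint32x4_t
    load_and_advance (uint32x4_t *base)
    {
      return vldrwq_gather_base_wb_u32 (base, 4);
    }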
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/arm-builtins.c (LDRGBWBS_QUALIFIERS): Define builtin
qualifier.
(LDRGBWBU_QUALIFIERS): Likewise.
(LDRGBWBS_Z_QUALIFIERS): Likewise.
(LDRGBWBU_Z_QUALIFIERS): Likewise.
(STRSBWBS_QUALIFIERS): Likewise.
(STRSBWBU_QUALIFIERS): Likewise.
(STRSBWBS_P_QUALIFIERS): Likewise.
(STRSBWBU_P_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vldrdq_gather_base_wb_s64): Define macro.
(vldrdq_gather_base_wb_u64): Likewise.
(vldrdq_gather_base_wb_z_s64): Likewise.
(vldrdq_gather_base_wb_z_u64): Likewise.
(vldrwq_gather_base_wb_f32): Likewise.
(vldrwq_gather_base_wb_s32): Likewise.
(vldrwq_gather_base_wb_u32): Likewise.
(vldrwq_gather_base_wb_z_f32): Likewise.
(vldrwq_gather_base_wb_z_s32): Likewise.
(vldrwq_gather_base_wb_z_u32): Likewise.
(vstrdq_scatter_base_wb_p_s64): Likewise.
(vstrdq_scatter_base_wb_p_u64): Likewise.
(vstrdq_scatter_base_wb_s64): Likewise.
(vstrdq_scatter_base_wb_u64): Likewise.
(vstrwq_scatter_base_wb_p_s32): Likewise.
(vstrwq_scatter_base_wb_p_f32): Likewise.
(vstrwq_scatter_base_wb_p_u32): Likewise.
(vstrwq_scatter_base_wb_s32): Likewise.
(vstrwq_scatter_base_wb_u32): Likewise.
(vstrwq_scatter_base_wb_f32): Likewise.
(__arm_vldrdq_gather_base_wb_s64): Define intrinsic.
(__arm_vldrdq_gather_base_wb_u64): Likewise.
(__arm_vldrdq_gather_base_wb_z_s64): Likewise.
(__arm_vldrdq_gather_base_wb_z_u64): Likewise.
(__arm_vldrwq_gather_base_wb_s32): Likewise.
(__arm_vldrwq_gather_base_wb_u32): Likewise.
(__arm_vldrwq_gather_base_wb_z_s32): Likewise.
(__arm_vldrwq_gather_base_wb_z_u32): Likewise.
(__arm_vstrdq_scatter_base_wb_s64): Likewise.
(__arm_vstrdq_scatter_base_wb_u64): Likewise.
(__arm_vstrdq_scatter_base_wb_p_s64): Likewise.
(__arm_vstrdq_scatter_base_wb_p_u64): Likewise.
(__arm_vstrwq_scatter_base_wb_p_s32): Likewise.
(__arm_vstrwq_scatter_base_wb_p_u32): Likewise.
(__arm_vstrwq_scatter_base_wb_s32): Likewise.
(__arm_vstrwq_scatter_base_wb_u32): Likewise.
(__arm_vldrwq_gather_base_wb_f32): Likewise.
(__arm_vldrwq_gather_base_wb_z_f32): Likewise.
(__arm_vstrwq_scatter_base_wb_f32): Likewise.
(__arm_vstrwq_scatter_base_wb_p_f32): Likewise.
(vstrwq_scatter_base_wb): Define polymorphic variant.
(vstrwq_scatter_base_wb_p): Likewise.
(vstrdq_scatter_base_wb_p): Likewise.
(vstrdq_scatter_base_wb): Likewise.
* config/arm/arm_mve_builtins.def (LDRGBWBS_QUALIFIERS): Use builtin
qualifier.
* config/arm/mve.md (mve_vstrwq_scatter_base_wb_<supf>v4si): Define RTL
pattern.
(mve_vstrwq_scatter_base_wb_add_<supf>v4si): Likewise.
(mve_vstrwq_scatter_base_wb_<supf>v4si_insn): Likewise.
(mve_vstrwq_scatter_base_wb_p_<supf>v4si): Likewise.
(mve_vstrwq_scatter_base_wb_p_add_<supf>v4si): Likewise.
(mve_vstrwq_scatter_base_wb_p_<supf>v4si_insn): Likewise.
(mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
(mve_vstrwq_scatter_base_wb_add_fv4sf): Likewise.
(mve_vstrwq_scatter_base_wb_fv4sf_insn): Likewise.
(mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
(mve_vstrwq_scatter_base_wb_p_add_fv4sf): Likewise.
(mve_vstrwq_scatter_base_wb_p_fv4sf_insn): Likewise.
(mve_vstrdq_scatter_base_wb_<supf>v2di): Likewise.
(mve_vstrdq_scatter_base_wb_add_<supf>v2di): Likewise.
(mve_vstrdq_scatter_base_wb_<supf>v2di_insn): Likewise.
(mve_vstrdq_scatter_base_wb_p_<supf>v2di): Likewise.
(mve_vstrdq_scatter_base_wb_p_add_<supf>v2di): Likewise.
(mve_vstrdq_scatter_base_wb_p_<supf>v2di_insn): Likewise.
(mve_vldrwq_gather_base_wb_<supf>v4si): Likewise.
(mve_vldrwq_gather_base_wb_<supf>v4si_insn): Likewise.
(mve_vldrwq_gather_base_wb_z_<supf>v4si): Likewise.
(mve_vldrwq_gather_base_wb_z_<supf>v4si_insn): Likewise.
(mve_vldrwq_gather_base_wb_fv4sf): Likewise.
(mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
(mve_vldrwq_gather_base_wb_z_fv4sf): Likewise.
(mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
(mve_vldrdq_gather_base_wb_<supf>v2di): Likewise.
(mve_vldrdq_gather_base_wb_<supf>v2di_insn): Likewise.
(mve_vldrdq_gather_base_wb_z_<supf>v2di): Likewise.
(mve_vldrdq_gather_base_wb_z_<supf>v2di_insn): Likewise.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_s64.c: New test.
* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrdq_gather_base_wb_z_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_wb_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_s64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_p_u64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_wb_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_f32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_p_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_wb_u32.c: Likewise.
Srinath Parvathaneni [Fri, 20 Mar 2020 11:58:30 +0000 (11:58 +0000)]
[ARM][GCC][1/8x]: MVE ACLE vidup, vddup, viwdup and vdwdup intrinsics with writeback.
This patch supports the following MVE ACLE intrinsics with writeback.
vddupq_m_n_u8, vddupq_m_n_u32, vddupq_m_n_u16, vddupq_m_wb_u8, vddupq_m_wb_u16, vddupq_m_wb_u32,
vddupq_n_u8, vddupq_n_u32, vddupq_n_u16, vddupq_wb_u8, vddupq_wb_u16, vddupq_wb_u32, vdwdupq_m_n_u8,
vdwdupq_m_n_u32, vdwdupq_m_n_u16, vdwdupq_m_wb_u8, vdwdupq_m_wb_u32, vdwdupq_m_wb_u16, vdwdupq_n_u8,
vdwdupq_n_u32, vdwdupq_n_u16, vdwdupq_wb_u8, vdwdupq_wb_u32, vdwdupq_wb_u16, vidupq_m_n_u8,
vidupq_m_n_u32, vidupq_m_n_u16, vidupq_m_wb_u8, vidupq_m_wb_u16, vidupq_m_wb_u32, vidupq_n_u8,
vidupq_n_u32, vidupq_n_u16, vidupq_wb_u8, vidupq_wb_u16, vidupq_wb_u32, viwdupq_m_n_u8, viwdupq_m_n_u32,
viwdupq_m_n_u16, viwdupq_m_wb_u8, viwdupq_m_wb_u32, viwdupq_m_wb_u16, viwdupq_n_u8, viwdupq_n_u32,
viwdupq_n_u16, viwdupq_wb_u8, viwdupq_wb_u32, viwdupq_wb_u16.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
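For illustration only (usage assumed from the MVE ACLE, not restated by the
patch): the writeback forms return the generated vector and update the scalar
start value through the pointer, so the next call continues the sequence.

    #include <arm_mve.h>

    /* Vector of values starting at *start and stepping by 2; *start is
       advanced past the generated sequence by the writeback.  */
    uint32x4_t
    make_ramp (uint32_t *start)
    {
      return vidupq_wb_u32 (start, 2);
    }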
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/arm-builtins.c
(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Define quinary
builtin qualifier.
* config/arm/arm_mve.h (vddupq_m_n_u8): Define macro.
(vddupq_m_n_u32): Likewise.
(vddupq_m_n_u16): Likewise.
(vddupq_m_wb_u8): Likewise.
(vddupq_m_wb_u16): Likewise.
(vddupq_m_wb_u32): Likewise.
(vddupq_n_u8): Likewise.
(vddupq_n_u32): Likewise.
(vddupq_n_u16): Likewise.
(vddupq_wb_u8): Likewise.
(vddupq_wb_u16): Likewise.
(vddupq_wb_u32): Likewise.
(vdwdupq_m_n_u8): Likewise.
(vdwdupq_m_n_u32): Likewise.
(vdwdupq_m_n_u16): Likewise.
(vdwdupq_m_wb_u8): Likewise.
(vdwdupq_m_wb_u32): Likewise.
(vdwdupq_m_wb_u16): Likewise.
(vdwdupq_n_u8): Likewise.
(vdwdupq_n_u32): Likewise.
(vdwdupq_n_u16): Likewise.
(vdwdupq_wb_u8): Likewise.
(vdwdupq_wb_u32): Likewise.
(vdwdupq_wb_u16): Likewise.
(vidupq_m_n_u8): Likewise.
(vidupq_m_n_u32): Likewise.
(vidupq_m_n_u16): Likewise.
(vidupq_m_wb_u8): Likewise.
(vidupq_m_wb_u16): Likewise.
(vidupq_m_wb_u32): Likewise.
(vidupq_n_u8): Likewise.
(vidupq_n_u32): Likewise.
(vidupq_n_u16): Likewise.
(vidupq_wb_u8): Likewise.
(vidupq_wb_u16): Likewise.
(vidupq_wb_u32): Likewise.
(viwdupq_m_n_u8): Likewise.
(viwdupq_m_n_u32): Likewise.
(viwdupq_m_n_u16): Likewise.
(viwdupq_m_wb_u8): Likewise.
(viwdupq_m_wb_u32): Likewise.
(viwdupq_m_wb_u16): Likewise.
(viwdupq_n_u8): Likewise.
(viwdupq_n_u32): Likewise.
(viwdupq_n_u16): Likewise.
(viwdupq_wb_u8): Likewise.
(viwdupq_wb_u32): Likewise.
(viwdupq_wb_u16): Likewise.
(__arm_vddupq_m_n_u8): Define intrinsic.
(__arm_vddupq_m_n_u32): Likewise.
(__arm_vddupq_m_n_u16): Likewise.
(__arm_vddupq_m_wb_u8): Likewise.
(__arm_vddupq_m_wb_u16): Likewise.
(__arm_vddupq_m_wb_u32): Likewise.
(__arm_vddupq_n_u8): Likewise.
(__arm_vddupq_n_u32): Likewise.
(__arm_vddupq_n_u16): Likewise.
(__arm_vdwdupq_m_n_u8): Likewise.
(__arm_vdwdupq_m_n_u32): Likewise.
(__arm_vdwdupq_m_n_u16): Likewise.
(__arm_vdwdupq_m_wb_u8): Likewise.
(__arm_vdwdupq_m_wb_u32): Likewise.
(__arm_vdwdupq_m_wb_u16): Likewise.
(__arm_vdwdupq_n_u8): Likewise.
(__arm_vdwdupq_n_u32): Likewise.
(__arm_vdwdupq_n_u16): Likewise.
(__arm_vdwdupq_wb_u8): Likewise.
(__arm_vdwdupq_wb_u32): Likewise.
(__arm_vdwdupq_wb_u16): Likewise.
(__arm_vidupq_m_n_u8): Likewise.
(__arm_vidupq_m_n_u32): Likewise.
(__arm_vidupq_m_n_u16): Likewise.
(__arm_vidupq_n_u8): Likewise.
(__arm_vidupq_m_wb_u8): Likewise.
(__arm_vidupq_m_wb_u16): Likewise.
(__arm_vidupq_m_wb_u32): Likewise.
(__arm_vidupq_n_u32): Likewise.
(__arm_vidupq_n_u16): Likewise.
(__arm_vidupq_wb_u8): Likewise.
(__arm_vidupq_wb_u16): Likewise.
(__arm_vidupq_wb_u32): Likewise.
(__arm_vddupq_wb_u8): Likewise.
(__arm_vddupq_wb_u16): Likewise.
(__arm_vddupq_wb_u32): Likewise.
(__arm_viwdupq_m_n_u8): Likewise.
(__arm_viwdupq_m_n_u32): Likewise.
(__arm_viwdupq_m_n_u16): Likewise.
(__arm_viwdupq_m_wb_u8): Likewise.
(__arm_viwdupq_m_wb_u32): Likewise.
(__arm_viwdupq_m_wb_u16): Likewise.
(__arm_viwdupq_n_u8): Likewise.
(__arm_viwdupq_n_u32): Likewise.
(__arm_viwdupq_n_u16): Likewise.
(__arm_viwdupq_wb_u8): Likewise.
(__arm_viwdupq_wb_u32): Likewise.
(__arm_viwdupq_wb_u16): Likewise.
(vidupq_m): Define polymorphic variant.
(vddupq_m): Likewise.
(vidupq_u16): Likewise.
(vidupq_u32): Likewise.
(vidupq_u8): Likewise.
(vddupq_u16): Likewise.
(vddupq_u32): Likewise.
(vddupq_u8): Likewise.
(viwdupq_m): Likewise.
(viwdupq_u16): Likewise.
(viwdupq_u32): Likewise.
(viwdupq_u8): Likewise.
(vdwdupq_m): Likewise.
(vdwdupq_u16): Likewise.
(vdwdupq_u32): Likewise.
(vdwdupq_u8): Likewise.
* config/arm/arm_mve_builtins.def
(QUINOP_UNONE_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Use builtin
qualifier.
* config/arm/mve.md (mve_vidupq_n_u<mode>): Define RTL pattern.
(mve_vidupq_u<mode>_insn): Likewise.
(mve_vidupq_m_n_u<mode>): Likewise.
(mve_vidupq_m_wb_u<mode>_insn): Likewise.
(mve_vddupq_n_u<mode>): Likewise.
(mve_vddupq_u<mode>_insn): Likewise.
(mve_vddupq_m_n_u<mode>): Likewise.
(mve_vddupq_m_wb_u<mode>_insn): Likewise.
(mve_vdwdupq_n_u<mode>): Likewise.
(mve_vdwdupq_wb_u<mode>): Likewise.
(mve_vdwdupq_wb_u<mode>_insn): Likewise.
(mve_vdwdupq_m_n_u<mode>): Likewise.
(mve_vdwdupq_m_wb_u<mode>): Likewise.
(mve_vdwdupq_m_wb_u<mode>_insn): Likewise.
(mve_viwdupq_n_u<mode>): Likewise.
(mve_viwdupq_wb_u<mode>): Likewise.
(mve_viwdupq_wb_u<mode>_insn): Likewise.
(mve_viwdupq_m_n_u<mode>): Likewise.
(mve_viwdupq_m_wb_u<mode>): Likewise.
(mve_viwdupq_m_wb_u<mode>_insn): Likewise.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* gcc.target/arm/mve/intrinsics/vddupq_m_n_u16.c: New test.
* gcc.target/arm/mve/intrinsics/vddupq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_m_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vddupq_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_m_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdwdupq_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_m_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vidupq_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_m_wb_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_wb_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_wb_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/viwdupq_wb_u8.c: Likewise.
Srinath Parvathaneni [Fri, 20 Mar 2020 11:50:21 +0000 (11:50 +0000)]
[ARM][GCC][7x]: MVE vreinterpretq and vuninitializedq intrinsics.
This patch supports the following MVE ACLE intrinsics.
vreinterpretq_s16_s32, vreinterpretq_s16_s64, vreinterpretq_s16_s8, vreinterpretq_s16_u16,
vreinterpretq_s16_u32, vreinterpretq_s16_u64, vreinterpretq_s16_u8, vreinterpretq_s32_s16,
vreinterpretq_s32_s64, vreinterpretq_s32_s8, vreinterpretq_s32_u16, vreinterpretq_s32_u32,
vreinterpretq_s32_u64, vreinterpretq_s32_u8, vreinterpretq_s64_s16, vreinterpretq_s64_s32,
vreinterpretq_s64_s8, vreinterpretq_s64_u16, vreinterpretq_s64_u32, vreinterpretq_s64_u64,
vreinterpretq_s64_u8, vreinterpretq_s8_s16, vreinterpretq_s8_s32, vreinterpretq_s8_s64,
vreinterpretq_s8_u16, vreinterpretq_s8_u32, vreinterpretq_s8_u64, vreinterpretq_s8_u8,
vreinterpretq_u16_s16, vreinterpretq_u16_s32, vreinterpretq_u16_s64, vreinterpretq_u16_s8,
vreinterpretq_u16_u32, vreinterpretq_u16_u64, vreinterpretq_u16_u8, vreinterpretq_u32_s16,
vreinterpretq_u32_s32, vreinterpretq_u32_s64, vreinterpretq_u32_s8, vreinterpretq_u32_u16,
vreinterpretq_u32_u64, vreinterpretq_u32_u8, vreinterpretq_u64_s16, vreinterpretq_u64_s32,
vreinterpretq_u64_s64, vreinterpretq_u64_s8, vreinterpretq_u64_u16, vreinterpretq_u64_u32,
vreinterpretq_u64_u8, vreinterpretq_u8_s16, vreinterpretq_u8_s32, vreinterpretq_u8_s64,
vreinterpretq_u8_s8, vreinterpretq_u8_u16, vreinterpretq_u8_u32, vreinterpretq_u8_u64,
vreinterpretq_s32_f16, vreinterpretq_s32_f32, vreinterpretq_u16_f16, vreinterpretq_u16_f32,
vreinterpretq_u32_f16, vreinterpretq_u32_f32, vreinterpretq_u64_f16, vreinterpretq_u64_f32,
vreinterpretq_u8_f16, vreinterpretq_u8_f32, vreinterpretq_f16_f32, vreinterpretq_f16_s16,
vreinterpretq_f16_s32, vreinterpretq_f16_s64, vreinterpretq_f16_s8, vreinterpretq_f16_u16,
vreinterpretq_f16_u32, vreinterpretq_f16_u64, vreinterpretq_f16_u8, vreinterpretq_f32_f16,
vreinterpretq_f32_s16, vreinterpretq_f32_s32, vreinterpretq_f32_s64, vreinterpretq_f32_s8,
vreinterpretq_f32_u16, vreinterpretq_f32_u32, vreinterpretq_f32_u64, vreinterpretq_f32_u8,
vreinterpretq_s16_f16, vreinterpretq_s16_f32, vreinterpretq_s64_f16, vreinterpretq_s64_f32,
vreinterpretq_s8_f16, vreinterpretq_s8_f32, vuninitializedq_u8, vuninitializedq_u16,
vuninitializedq_u32, vuninitializedq_u64, vuninitializedq_s8, vuninitializedq_s16,
vuninitializedq_s32, vuninitializedq_s64, vuninitializedq_f16, vuninitializedq_f32 and
vuninitializedq.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
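A hedged usage sketch (prototypes assumed to follow the ACLE): vreinterpretq
reuses the same 128 bits under a different element type, while vuninitializedq
yields a vector whose contents are unspecified.

    #include <arm_mve.h>

    uint32x4_t
    bits_of (float32x4_t f)
    {
      /* Same 128-bit register contents, viewed as four uint32_t lanes.  */
      return vreinterpretq_u32_f32 (f);
    }

    int16x8_t
    scratch (void)
    {
      /* Explicitly uninitialized vector, e.g. to pass where the inactive
         lanes of a predicated operation do not matter.  */
      return vuninitializedq_s16 ();
    }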
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vreinterpretq_s16_s32): Define macro.
(vreinterpretq_s16_s64): Likewise.
(vreinterpretq_s16_s8): Likewise.
(vreinterpretq_s16_u16): Likewise.
(vreinterpretq_s16_u32): Likewise.
(vreinterpretq_s16_u64): Likewise.
(vreinterpretq_s16_u8): Likewise.
(vreinterpretq_s32_s16): Likewise.
(vreinterpretq_s32_s64): Likewise.
(vreinterpretq_s32_s8): Likewise.
(vreinterpretq_s32_u16): Likewise.
(vreinterpretq_s32_u32): Likewise.
(vreinterpretq_s32_u64): Likewise.
(vreinterpretq_s32_u8): Likewise.
(vreinterpretq_s64_s16): Likewise.
(vreinterpretq_s64_s32): Likewise.
(vreinterpretq_s64_s8): Likewise.
(vreinterpretq_s64_u16): Likewise.
(vreinterpretq_s64_u32): Likewise.
(vreinterpretq_s64_u64): Likewise.
(vreinterpretq_s64_u8): Likewise.
(vreinterpretq_s8_s16): Likewise.
(vreinterpretq_s8_s32): Likewise.
(vreinterpretq_s8_s64): Likewise.
(vreinterpretq_s8_u16): Likewise.
(vreinterpretq_s8_u32): Likewise.
(vreinterpretq_s8_u64): Likewise.
(vreinterpretq_s8_u8): Likewise.
(vreinterpretq_u16_s16): Likewise.
(vreinterpretq_u16_s32): Likewise.
(vreinterpretq_u16_s64): Likewise.
(vreinterpretq_u16_s8): Likewise.
(vreinterpretq_u16_u32): Likewise.
(vreinterpretq_u16_u64): Likewise.
(vreinterpretq_u16_u8): Likewise.
(vreinterpretq_u32_s16): Likewise.
(vreinterpretq_u32_s32): Likewise.
(vreinterpretq_u32_s64): Likewise.
(vreinterpretq_u32_s8): Likewise.
(vreinterpretq_u32_u16): Likewise.
(vreinterpretq_u32_u64): Likewise.
(vreinterpretq_u32_u8): Likewise.
(vreinterpretq_u64_s16): Likewise.
(vreinterpretq_u64_s32): Likewise.
(vreinterpretq_u64_s64): Likewise.
(vreinterpretq_u64_s8): Likewise.
(vreinterpretq_u64_u16): Likewise.
(vreinterpretq_u64_u32): Likewise.
(vreinterpretq_u64_u8): Likewise.
(vreinterpretq_u8_s16): Likewise.
(vreinterpretq_u8_s32): Likewise.
(vreinterpretq_u8_s64): Likewise.
(vreinterpretq_u8_s8): Likewise.
(vreinterpretq_u8_u16): Likewise.
(vreinterpretq_u8_u32): Likewise.
(vreinterpretq_u8_u64): Likewise.
(vreinterpretq_s32_f16): Likewise.
(vreinterpretq_s32_f32): Likewise.
(vreinterpretq_u16_f16): Likewise.
(vreinterpretq_u16_f32): Likewise.
(vreinterpretq_u32_f16): Likewise.
(vreinterpretq_u32_f32): Likewise.
(vreinterpretq_u64_f16): Likewise.
(vreinterpretq_u64_f32): Likewise.
(vreinterpretq_u8_f16): Likewise.
(vreinterpretq_u8_f32): Likewise.
(vreinterpretq_f16_f32): Likewise.
(vreinterpretq_f16_s16): Likewise.
(vreinterpretq_f16_s32): Likewise.
(vreinterpretq_f16_s64): Likewise.
(vreinterpretq_f16_s8): Likewise.
(vreinterpretq_f16_u16): Likewise.
(vreinterpretq_f16_u32): Likewise.
(vreinterpretq_f16_u64): Likewise.
(vreinterpretq_f16_u8): Likewise.
(vreinterpretq_f32_f16): Likewise.
(vreinterpretq_f32_s16): Likewise.
(vreinterpretq_f32_s32): Likewise.
(vreinterpretq_f32_s64): Likewise.
(vreinterpretq_f32_s8): Likewise.
(vreinterpretq_f32_u16): Likewise.
(vreinterpretq_f32_u32): Likewise.
(vreinterpretq_f32_u64): Likewise.
(vreinterpretq_f32_u8): Likewise.
(vreinterpretq_s16_f16): Likewise.
(vreinterpretq_s16_f32): Likewise.
(vreinterpretq_s64_f16): Likewise.
(vreinterpretq_s64_f32): Likewise.
(vreinterpretq_s8_f16): Likewise.
(vreinterpretq_s8_f32): Likewise.
(vuninitializedq_u8): Likewise.
(vuninitializedq_u16): Likewise.
(vuninitializedq_u32): Likewise.
(vuninitializedq_u64): Likewise.
(vuninitializedq_s8): Likewise.
(vuninitializedq_s16): Likewise.
(vuninitializedq_s32): Likewise.
(vuninitializedq_s64): Likewise.
(vuninitializedq_f16): Likewise.
(vuninitializedq_f32): Likewise.
(__arm_vuninitializedq_u8): Define intrinsic.
(__arm_vuninitializedq_u16): Likewise.
(__arm_vuninitializedq_u32): Likewise.
(__arm_vuninitializedq_u64): Likewise.
(__arm_vuninitializedq_s8): Likewise.
(__arm_vuninitializedq_s16): Likewise.
(__arm_vuninitializedq_s32): Likewise.
(__arm_vuninitializedq_s64): Likewise.
(__arm_vreinterpretq_s16_s32): Likewise.
(__arm_vreinterpretq_s16_s64): Likewise.
(__arm_vreinterpretq_s16_s8): Likewise.
(__arm_vreinterpretq_s16_u16): Likewise.
(__arm_vreinterpretq_s16_u32): Likewise.
(__arm_vreinterpretq_s16_u64): Likewise.
(__arm_vreinterpretq_s16_u8): Likewise.
(__arm_vreinterpretq_s32_s16): Likewise.
(__arm_vreinterpretq_s32_s64): Likewise.
(__arm_vreinterpretq_s32_s8): Likewise.
(__arm_vreinterpretq_s32_u16): Likewise.
(__arm_vreinterpretq_s32_u32): Likewise.
(__arm_vreinterpretq_s32_u64): Likewise.
(__arm_vreinterpretq_s32_u8): Likewise.
(__arm_vreinterpretq_s64_s16): Likewise.
(__arm_vreinterpretq_s64_s32): Likewise.
(__arm_vreinterpretq_s64_s8): Likewise.
(__arm_vreinterpretq_s64_u16): Likewise.
(__arm_vreinterpretq_s64_u32): Likewise.
(__arm_vreinterpretq_s64_u64): Likewise.
(__arm_vreinterpretq_s64_u8): Likewise.
(__arm_vreinterpretq_s8_s16): Likewise.
(__arm_vreinterpretq_s8_s32): Likewise.
(__arm_vreinterpretq_s8_s64): Likewise.
(__arm_vreinterpretq_s8_u16): Likewise.
(__arm_vreinterpretq_s8_u32): Likewise.
(__arm_vreinterpretq_s8_u64): Likewise.
(__arm_vreinterpretq_s8_u8): Likewise.
(__arm_vreinterpretq_u16_s16): Likewise.
(__arm_vreinterpretq_u16_s32): Likewise.
(__arm_vreinterpretq_u16_s64): Likewise.
(__arm_vreinterpretq_u16_s8): Likewise.
(__arm_vreinterpretq_u16_u32): Likewise.
(__arm_vreinterpretq_u16_u64): Likewise.
(__arm_vreinterpretq_u16_u8): Likewise.
(__arm_vreinterpretq_u32_s16): Likewise.
(__arm_vreinterpretq_u32_s32): Likewise.
(__arm_vreinterpretq_u32_s64): Likewise.
(__arm_vreinterpretq_u32_s8): Likewise.
(__arm_vreinterpretq_u32_u16): Likewise.
(__arm_vreinterpretq_u32_u64): Likewise.
(__arm_vreinterpretq_u32_u8): Likewise.
(__arm_vreinterpretq_u64_s16): Likewise.
(__arm_vreinterpretq_u64_s32): Likewise.
(__arm_vreinterpretq_u64_s64): Likewise.
(__arm_vreinterpretq_u64_s8): Likewise.
(__arm_vreinterpretq_u64_u16): Likewise.
(__arm_vreinterpretq_u64_u32): Likewise.
(__arm_vreinterpretq_u64_u8): Likewise.
(__arm_vreinterpretq_u8_s16): Likewise.
(__arm_vreinterpretq_u8_s32): Likewise.
(__arm_vreinterpretq_u8_s64): Likewise.
(__arm_vreinterpretq_u8_s8): Likewise.
(__arm_vreinterpretq_u8_u16): Likewise.
(__arm_vreinterpretq_u8_u32): Likewise.
(__arm_vreinterpretq_u8_u64): Likewise.
(__arm_vuninitializedq_f16): Likewise.
(__arm_vuninitializedq_f32): Likewise.
(__arm_vreinterpretq_s32_f16): Likewise.
(__arm_vreinterpretq_s32_f32): Likewise.
(__arm_vreinterpretq_s16_f16): Likewise.
(__arm_vreinterpretq_s16_f32): Likewise.
(__arm_vreinterpretq_s64_f16): Likewise.
(__arm_vreinterpretq_s64_f32): Likewise.
(__arm_vreinterpretq_s8_f16): Likewise.
(__arm_vreinterpretq_s8_f32): Likewise.
(__arm_vreinterpretq_u16_f16): Likewise.
(__arm_vreinterpretq_u16_f32): Likewise.
(__arm_vreinterpretq_u32_f16): Likewise.
(__arm_vreinterpretq_u32_f32): Likewise.
(__arm_vreinterpretq_u64_f16): Likewise.
(__arm_vreinterpretq_u64_f32): Likewise.
(__arm_vreinterpretq_u8_f16): Likewise.
(__arm_vreinterpretq_u8_f32): Likewise.
(__arm_vreinterpretq_f16_f32): Likewise.
(__arm_vreinterpretq_f16_s16): Likewise.
(__arm_vreinterpretq_f16_s32): Likewise.
(__arm_vreinterpretq_f16_s64): Likewise.
(__arm_vreinterpretq_f16_s8): Likewise.
(__arm_vreinterpretq_f16_u16): Likewise.
(__arm_vreinterpretq_f16_u32): Likewise.
(__arm_vreinterpretq_f16_u64): Likewise.
(__arm_vreinterpretq_f16_u8): Likewise.
(__arm_vreinterpretq_f32_f16): Likewise.
(__arm_vreinterpretq_f32_s16): Likewise.
(__arm_vreinterpretq_f32_s32): Likewise.
(__arm_vreinterpretq_f32_s64): Likewise.
(__arm_vreinterpretq_f32_s8): Likewise.
(__arm_vreinterpretq_f32_u16): Likewise.
(__arm_vreinterpretq_f32_u32): Likewise.
(__arm_vreinterpretq_f32_u64): Likewise.
(__arm_vreinterpretq_f32_u8): Likewise.
(vuninitializedq): Define polymorphic variant.
(vreinterpretq_f16): Likewise.
(vreinterpretq_f32): Likewise.
(vreinterpretq_s16): Likewise.
(vreinterpretq_s32): Likewise.
(vreinterpretq_s64): Likewise.
(vreinterpretq_s8): Likewise.
(vreinterpretq_u16): Likewise.
(vreinterpretq_u32): Likewise.
(vreinterpretq_u64): Likewise.
(vreinterpretq_u8): Likewise.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vuninitializedq_float.c: New test.
* gcc.target/arm/mve/intrinsics/vuninitializedq_float1.c: Likewise.
* gcc.target/arm/mve/intrinsics/vuninitializedq_int.c: Likewise.
* gcc.target/arm/mve/intrinsics/vuninitializedq_int1.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vreinterpretq_u8.c: Likewise.
Srinath Parvathaneni [Fri, 20 Mar 2020 11:44:08 +0000 (11:44 +0000)]
[ARM][GCC][6x]: MVE ACLE vaddq intrinsics using arithmetic plus operator.
This patch supports the following MVE ACLE vaddq intrinsics. The RTL patterns for these intrinsics are added using the arithmetic "plus" operator.
vaddq_s8, vaddq_s16, vaddq_s32, vaddq_u8, vaddq_u16, vaddq_u32, vaddq_f16, vaddq_f32.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
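A minimal sketch (assuming the standard arm_mve.h prototypes): because the RTL
pattern uses the plain "plus" operator, the intrinsic and the infix form should
generate the same vector add.

    #include <arm_mve.h>

    int32x4_t
    add_vectors (int32x4_t a, int32x4_t b)
    {
      return vaddq_s32 (a, b);  /* equivalent to a + b on MVE vector types */
    }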
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* config/arm/arm_mve.h (vaddq_s8): Define macro.
(vaddq_s16): Likewise.
(vaddq_s32): Likewise.
(vaddq_u8): Likewise.
(vaddq_u16): Likewise.
(vaddq_u32): Likewise.
(vaddq_f16): Likewise.
(vaddq_f32): Likewise.
(__arm_vaddq_s8): Define intrinsic.
(__arm_vaddq_s16): Likewise.
(__arm_vaddq_s32): Likewise.
(__arm_vaddq_u8): Likewise.
(__arm_vaddq_u16): Likewise.
(__arm_vaddq_u32): Likewise.
(__arm_vaddq_f16): Likewise.
(__arm_vaddq_f32): Likewise.
(vaddq): Define polymorphic variant.
* config/arm/iterators.md (VNIM): Define mode iterator for types common to
Neon, IWMMXT and MVE.
(VNINOTM): Likewise.
* config/arm/mve.md (mve_vaddq<mode>): Define RTL pattern.
(mve_vaddq_f<mode>): Define RTL pattern.
* config/arm/neon.md (add<mode>3): Rename to addv4hf3 RTL pattern.
(addv8hf3_neon): Define RTL pattern.
* config/arm/vec-common.md (add<mode>3): Modify standard add RTL pattern
to support MVE.
(addv8hf3): Define standard RTL pattern for MVE and Neon.
(add<mode>3): Modify existing standard add RTL pattern for Neon and IWMMXT.
gcc/testsuite/ChangeLog:
2020-03-20 Srinath Parvathaneni <srinath.parvathaneni@arm.com>
Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
* gcc.target/arm/mve/intrinsics/vaddq_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vaddq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_u8.c: Likewise.
Martin Liska [Fri, 20 Mar 2020 10:01:13 +0000 (11:01 +0100)]
Use the correct offset in ipa_get_jf_ancestor_result.
PR ipa/94232
* ipa-cp.c (ipa_get_jf_ancestor_result): Use the offset in bytes.  Previously
the build_ref_for_offset function was used, and it converted the offset
from bits to bytes.
Richard Biener [Fri, 20 Mar 2020 09:52:02 +0000 (10:52 +0100)]
tree-optimization/94266 - fix object type extraction heuristics
This fixes the heuristic that derives an actual object type from a
MEM_REF's pointer operand so that it uses the more sensible type of an
actual object instead of the pointed-to type.
2020-03-20 Richard Biener <rguenther@suse.de>
PR tree-optimization/94266
* gimple-ssa-sprintf.c (get_origin_and_offset): Use the
type of the underlying object to adjust for the containing
field if available.
Andre Simoes Dias Vieira [Fri, 20 Mar 2020 09:18:18 +0000 (09:18 +0000)]
gcc, Arm: Revert changes to {get,set}_fpscr
MVE made changes to {get,set}_fpscr to enable the compiler to optimize
unnecessary gets and sets when using these for intrinsics that use and/or write
the carry bit. However, these actually get and set the full FPSCR register and
are used by fp env intrinsics to modify the fp context. So MVE should not be
using these.
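For context, a hedged sketch (builtin names assumed to match the existing ARM
builtins behind the fp environment intrinsics): these read and write the whole
FPSCR, not just the carry bit that the MVE intrinsics care about.

    void
    save_restore_fpscr (void)
    {
      /* Assumed builtins; they access the full FPSCR (rounding mode,
         exception flags, ...), not only the MVE carry flag.  */
      unsigned int saved = __builtin_arm_get_fpscr ();
      __builtin_arm_set_fpscr (saved);
    }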
gcc/ChangeLog:
2020-03-20 Andre Vieira <andre.simoesdiasvieira@arm.com>
* config/arm/unspecs.md (UNSPEC_GET_FPSCR): Rename this to ...
(VUNSPEC_GET_FPSCR): ... this, and move it to vunspec.
* config/arm/vfp.md: (get_fpscr, set_fpscr): Revert to old patterns.
Andre Simoes Dias Vieira [Fri, 20 Mar 2020 09:10:17 +0000 (09:10 +0000)]
gcc, Arm: Fix testisms for MVE testsuite
This patch fixes some testisms where -mfpu=auto was missing or where we could
end up with both -mfloat-abi=hard and -mfloat-abi=soft on the same command line.
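As an illustration only (the exact per-test options are not shown in this log), the sort of dejagnu directive such a fix adds; the option set here is an assumption:
  /* { dg-additional-options "-march=armv8.1-m.main+mve.fp -mfloat-abi=hard -mfpu=auto" } */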
gcc/testsuite/ChangeLog:
2020-03-20 Andre Vieira <andre.simoesdiasvieira@arm.com>
* gcc.target/arm/mve/intrinsics/mve_fp_fpu1.c: Fix testisms.
* gcc.target/arm/mve/intrinsics/mve_fp_fpu2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_fpu1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_fpu2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_fpu3.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_libcall1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_libcall2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_float.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_float1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_float2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_int.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_int1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_int2.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_uint.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_uint1.c: Likewise.
* gcc.target/arm/mve/intrinsics/mve_vector_uint2.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u32.c: Likewise.
Andre Simoes Dias Vieira [Fri, 20 Mar 2020 09:07:10 +0000 (09:07 +0000)]
gcc, Arm: Fix MVE move from GPR -> GPR
This patch fixes the pattern mve_mov for the case where both MVE vectors are in
R registers and the move does not get optimized away. I use the same approach
as we do for NEON, where we use four register moves.
gcc/ChangeLog:
2020-03-20 Andre Vieira <andre.simoesdiasvieira@arm.com>
* config/arm/mve.md (mve_mov<mode>): Fix R->R case.
gcc/testsuite/ChangeLog:
2020-03-20 Andre Vieira <andre.simoesdiasvieira@arm.com>
* gcc.target/arm/mve/intrinsics/mve_move_gpr_to_gpr.c: New test.
Jakub Jelinek [Fri, 20 Mar 2020 08:33:38 +0000 (09:33 +0100)]
store-merging: Fix up -fnon-call-exceptions handling [PR94224]
When we are adding a single store into a store group, we already check
that store->lp_nr matches, but we also have code to add further
INTEGER_CST stores into the group right away if the ordering requires that
we put either all or none of a certain set of stores there. In those
cases we weren't doing these lp_nr checks, which means we could end up with
stores with different lp_nr in the same group, which then ICEs during
output_merged_store.
2020-03-20 Jakub Jelinek <jakub@redhat.com>
PR tree-optimization/94224
* gimple-ssa-store-merging.c
(imm_store_chain_info::coalesce_immediate): Don't consider overlapping
or adjacent INTEGER_CST rhs_code stores as mergeable if they have
different lp_nr.
* g++.dg/tree-ssa/pr94224.C: New test.
Andre Simoes Dias Vieira [Fri, 20 Mar 2020 08:25:56 +0000 (08:25 +0000)]
gcc, Arm: Fix no_cond issue introduced by MVE
This was a matter of mistaken logic in (define_attr "conds" ..). This was
setting the conds attribute for any neon instruction to no_cond, which was
messing up code generation.
gcc/ChangeLog:
2020-03-20 Andre Vieira <andre.simoesdiasvieira@arm.com>
* config/arm/arm.md (define_attr "conds"): Fix logic for neon and mve.
Bin Bin Lv [Thu, 19 Mar 2020 16:10:44 +0000 (12:10 -0400)]
[rs6000] Rewrite the declaration of a variable
Move the declaration of toc_section from the source file rs6000.c to its
header file to standardize the code.
Bootstrap and regression were done on powerpc64le-linux-gnu (LE) with no
regressions.
gcc/ChangeLog
2020-03-20 Bin Bin Lv <shlb@linux.ibm.com>
* config/rs6000/rs6000-internal.h (toc_section): Remove the
declaration.
* config/rs6000/rs6000.h (toc_section): Add the declaration.
* config/rs6000/rs6000.c (toc_section): Remove the declaration.
Jason Merrill [Thu, 19 Mar 2020 15:06:52 +0000 (11:06 -0400)]
c++: Avoid unnecessary empty class copy [94175].
A simple empty class copy is still simple when wrapped in a TARGET_EXPR, so
we need to strip that as well. This change also exposed some unnecessary
copies in return statements, which, when returning by invisible reference, led
to <RETURN_EXPR <MEM_REF <RESULT_DECL>>>, which gimplify_return_expr didn't
like. So we also need to strip the _REF when we eliminate the INIT_EXPR.
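A hypothetical shape of code affected (not the PR testcase): an empty class returned and passed by value, where the redundant copy can now be elided even when wrapped in a TARGET_EXPR:
  struct Empty { };
  Empty make () { return Empty (); }
  void take (Empty) { }
  void use () { take (make ()); }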
gcc/cp/ChangeLog
2020-03-19 Jason Merrill <jason@redhat.com>
PR c++/94175
* cp-gimplify.c (simple_empty_class_p): Look through
SIMPLE_TARGET_EXPR_P.
(cp_gimplify_expr) [MODIFY_EXPR]: Likewise.
[RETURN_EXPR]: Avoid producing 'return *retval;'.
* call.c (build_call_a): Strip TARGET_EXPR from empty class arg.
* cp-tree.h (SIMPLE_TARGET_EXPR_P): Check that TARGET_EXPR_INITIAL
is non-null.
GCC Administrator [Fri, 20 Mar 2020 00:16:30 +0000 (00:16 +0000)]
Daily bump.
Jan Hubicka [Thu, 19 Mar 2020 23:42:13 +0000 (00:42 +0100)]
Fix cgraph_node::function_symbol availability computation [PR94202]
This fixes an ICE in the inliner cache sanity check which is caused by a very
old bug in the visibility calculation in cgraph_node::function_symbol and
cgraph_node::function_or_virtual_thunk_symbol.
In the testcase there is an indirect call to a thunk. At the beginning we
correctly see its body as AVAIL_AVAILABLE, but later we inline into the thunk
and this turns it to AVAIL_INTERPOSABLE.
This is because function_symbol incorrectly overwrites the availability
parameter by the availability of the alias used in the call within the thunk,
which is a local alias.
gcc/ChangeLog:
2020-03-19 Jan Hubicka <hubicka@ucw.cz>
PR ipa/94202
* cgraph.c (cgraph_node::function_symbol): Fix availability computation.
(cgraph_node::function_or_virtual_thunk_symbol): Likewise.
gcc/testsuite/ChangeLog:
2020-03-19 Jan Hubicka <hubicka@ucw.cz>
PR ipa/94202
* g++.dg/torture/pr94202.C: New test.
Jakub Jelinek [Thu, 19 Mar 2020 21:56:20 +0000 (22:56 +0100)]
c: Fix up cfun->function_end_locus from the C FE [PR94029]
On the following testcase we ICE because while
DECL_STRUCT_FUNCTION (current_function_decl)->function_start_locus
= c_parser_peek_token (parser)->location;
and similarly DECL_SOURCE_LOCATION (fndecl) is set from some token's
location, the end is set as:
/* Store the end of the function, so that we get good line number
info for the epilogue. */
cfun->function_end_locus = input_location;
and the thing is that input_location is only very rarely set in the C FE
(the primary spot that changes it is the cb_line_change/fe_file_change).
Which means, e.g., for pretty much all C functions that are on a single line,
the function_start_locus column is greater than the function_end_locus column,
and the testcase even has a smaller line in function_end_locus because
cb_line_change isn't performed while parsing multi-line arguments of a
function-like macro.
Attached are two possible fixes to achieve what the C++ FE does, in
particular that cfun->function_end_locus is the locus of the closing } of
the function. The first one updates input_location when we see a closing }
of a compound statement (though any, not just the function body) and thus
input_location in the finish_function call is what we need.
The second instead propagates the location_t from the parsing of the
outermost compound statement (the function body) to finish_function.
The second one is this version.
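A trivial illustration (not the PR testcase) of why this matters: for a function defined on a single line, the closing } is to the right of the first token, so function_end_locus must come from the } rather than from a stale input_location:
  int f (void) { return 0; }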
2020-03-19 Jakub Jelinek <jakub@redhat.com>
PR gcov-profile/94029
* c-tree.h (finish_function): Add location_t argument defaulted to
input_location.
* c-parser.c (c_parser_compound_statement): Add endlocp argument and
set it to the locus of closing } if non-NULL.
(c_parser_compound_statement_nostart): Return locus of closing }.
(c_parser_parse_rtl_body): Likewise.
(c_parser_declaration_or_fndef): Propagate locus of closing } to
finish_function.
* c-decl.c (finish_function): Add end_loc argument, use it instead of
input_location to set function_end_locus.
* gcc.misc-tests/gcov-pr94029.c: New test.
Iain Buclaw [Tue, 17 Mar 2020 18:33:14 +0000 (19:33 +0100)]
d/dmd: Merge upstream dmd
d1a606599
Fixes a long-standing regression in the D front-end implementation, and adds
a new field to allow retrieving a list of all content imports from the
code generator.
Reviewed-on: https://github.com/dlang/dmd/pull/10913
https://github.com/dlang/dmd/pull/10933
Jan Hubicka [Thu, 19 Mar 2020 16:12:56 +0000 (17:12 +0100)]
Fix inliner ICE on alias with flatten attribute [PR92372]
gcc/ChangeLog:
2020-03-19 Jan Hubicka <hubicka@ucw.cz>
PR ipa/92372
* cgraphunit.c (process_function_and_variable_attributes): Warn
for flatten attribute on alias.
* ipa-inline.c (ipa_inline): Do not ICE on flatten attribute on alias.
gcc/testsuite/ChangeLog:
2020-03-19 Jan Hubicka <hubicka@ucw.cz>
PR ipa/92372
* gcc.c-torture/pr92372.c: New test.
* gcc.dg/attr-flatten-1.c: New test.
Martin Liska [Thu, 19 Mar 2020 15:56:27 +0000 (16:56 +0100)]
API extension for binutils (type of symbols).
* lto-section-in.c: Add ext_symtab.
* lto-streamer-out.c (write_symbol_extension_info): New.
(produce_symtab_extension): New.
(produce_asm_for_decls): Stream also produce_symtab_extension.
* lto-streamer.h (enum lto_section_type): New section.
* lto-symtab.h (enum gcc_plugin_symbol_type): New.
(enum gcc_plugin_symbol_section_kind): Likewise.
* lto-plugin.c (LTO_SECTION_PREFIX): Rename to ...
(LTO_SYMTAB_PREFIX): ... this.
(LTO_SECTION_PREFIX_LEN): Rename to ...
(LTO_SYMTAB_PREFIX_LEN): ... this.
(LTO_SYMTAB_EXT_PREFIX): New.
(LTO_SYMTAB_EXT_PREFIX_LEN): New.
(LTO_LTO_PREFIX): New.
(LTO_LTO_PREFIX_LEN): New.
(parse_table_entry): Fill unused fields with zero.
(parse_table_entry_extension): New.
(parse_symtab_extension): New.
(finish_conflict_resolution): Change type
for resolution.
(process_symtab): Use new macro name.
(process_symtab_extension): New.
(claim_file_handler): Parse also process_symtab_extension.
(onload): Call new add_symbols_v2.
Martin Liska [Thu, 19 Mar 2020 15:55:59 +0000 (16:55 +0100)]
Update include/plugin-api.h.
* plugin-api.h (struct ld_plugin_symbol): Split
int def into 4 char fields.
(enum ld_plugin_symbol_type): New.
(enum ld_plugin_symbol_section_kind): New.
(enum ld_plugin_tag): Add LDPT_ADD_SYMBOLS_V2.
Jakub Jelinek [Thu, 19 Mar 2020 11:22:47 +0000 (12:22 +0100)]
c++: Fix up handling of captured vars in lambdas in OpenMP clauses [PR93931]
Without the parser.c change we were ICEing on the testcase, because while the
uses of the captured vars inside of the constructs were replaced with capture
proxy decls, we didn't do that for decls in OpenMP clauses.
With that fixed, we don't ICE anymore, but the testcase is miscompiled and FAILs
at runtime. This is because the capture proxy decls have DECL_VALUE_EXPR and
during gimplification we were gimplifying those to their DECL_VALUE_EXPRs.
That is fine for shared vars, but for privatized ones we must not do that.
So that is what the cp-gimplify.c changes do. Had to add a DECL_CONTEXT check
before calling is_capture_proxy because some VAR_DECLs don't have DECL_CONTEXT
set (yet) and is_capture_proxy relies on that being non-NULL always.
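A hypothetical reduced shape of the problem (not the actual testcase): a variable captured by a lambda and then named directly in an OpenMP clause, so the clause must refer to the capture proxy:
  void f ()
  {
    int x = 0;
    [&x] () {
      #pragma omp parallel for firstprivate (x)
      for (int i = 0; i < 4; i++)
        x += i;   /* x here is the capture proxy; it is privatized, not shared */
    } ();
  }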
2020-03-19 Jakub Jelinek <jakub@redhat.com>
PR c++/93931
* parser.c (cp_parser_omp_var_list_no_open): Call process_outer_var_ref
on outer_automatic_var_p decls.
* cp-gimplify.c (cxx_omp_disregard_value_expr): Return true also for
capture proxy decls.
* testsuite/libgomp.c++/pr93931.C: New test.
Tobias Burnus [Thu, 19 Mar 2020 10:42:49 +0000 (11:42 +0100)]
libgomp/testsuite: ignore blank-line output for function-not-offloaded.c
* testsuite/libgomp.c-c++-common/function-not-offloaded.c: Add
dg-allow-blank-lines-in-output.
Jakub Jelinek [Thu, 19 Mar 2020 09:24:16 +0000 (10:24 +0100)]
phiopt: Avoid -fcompare-debug bug in phiopt [PR94211]
Two years ago, I added support for up to 2 simple preparation statements
in value_replacement, but the
- && estimate_num_insns (assign, &eni_time_weights)
+ && estimate_num_insns (bb_seq (middle_bb), &eni_time_weights)
change, which was meant to make us compute the cost of all those statements
rather than just the single assign that had been the only supported non-debug
statement in the bb before, doesn't do what I thought it would: gimple_seq
is just gimple * and thus the call can't really be overloaded depending on
whether we pass it a single gimple * or a whole sequence. Which means for the
last two years it hasn't been counting all the statements, but only the first
one. With -g that happens to be a DEBUG_STMT, or it could be e.g. the first
preparation statement, which could be much cheaper than the actual assign.
2020-03-19 Jakub Jelinek <jakub@redhat.com>
PR tree-optimization/94211
* tree-ssa-phiopt.c (value_replacement): Use estimate_num_insns_seq
instead of estimate_num_insns for bb_seq (middle_bb). Rename
emtpy_or_with_defined_p variable to empty_or_with_defined_p, adjust
all uses.
* gcc.dg/pr94211.c: New test.
Richard Biener [Thu, 19 Mar 2020 09:19:24 +0000 (10:19 +0100)]
ipa/94217 simplify offsetted address build
This avoids using build_ref_for_offset and build_fold_addr_expr
where type mixup easily results in something not IP invariant.
2020-03-19 Richard Biener <rguenther@suse.de>
PR ipa/94217
* ipa-cp.c (ipa_get_jf_ancestor_result): Avoid build_fold_addr_expr
and build_ref_for_offset.
Richard Biener [Thu, 19 Mar 2020 09:15:52 +0000 (10:15 +0100)]
middle-end/94216 fix another build_fold_addr_expr use
2020-03-19 Richard Biener <rguenther@suse.de>
PR middle-end/94216
* fold-const.c (fold_binary_loc): Avoid using
build_fold_addr_expr when we really want an ADDR_EXPR.
* g++.dg/torture/pr94216.C: New testcase.
GCC Administrator [Thu, 19 Mar 2020 00:16:20 +0000 (00:16 +0000)]
Daily bump.
Jonathan Wakely [Wed, 18 Mar 2020 23:19:12 +0000 (23:19 +0000)]
libstdc++: Fix is_trivially_constructible (PR 94033)
This attempts to make is_nothrow_constructible more robust (and
efficient to compile) by not depending on is_constructible. Instead the
__is_constructible intrinsic is used directly. The helper class
__is_nt_constructible_impl which checks whether the construction is
non-throwing now takes a bool template parameter that is substituted by
the result of the intrinsic. This fixes the reported bug by not using
the already-instantiated (and incorrect) value of std::is_constructible.
I don't think it really fixes the problem in general, because
std::is_nothrow_constructible itself could already have been
instantiated in a context where it gives the wrong result. A proper fix
needs to be done in the compiler.
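A minimal sketch of the general technique (simplified, not the exact libstdc++ code): dispatch on the __is_constructible intrinsic through a bool parameter so the noexcept check is only instantiated when construction is actually possible. The names below are illustrative only:
  #include <type_traits>
  #include <utility>

  template<bool /* constructible */, typename T, typename... Args>
    struct nt_constructible_impl : std::false_type { };

  template<typename T, typename... Args>
    struct nt_constructible_impl<true, T, Args...>
    : std::bool_constant<noexcept(T(std::declval<Args>()...))> { };

  template<typename T, typename... Args>
    using nothrow_constructible
      = nt_constructible_impl<__is_constructible(T, Args...), T, Args...>;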
PR libstdc++/94033
* include/std/type_traits (__is_nt_default_constructible_atom): Remove.
(__is_nt_default_constructible_impl): Remove.
(__is_nothrow_default_constructible_impl): Remove.
(__is_nt_constructible_impl): Add bool template parameter. Adjust
partial specializations.
(__is_nothrow_constructible_impl): Replace class template with alias
template.
(is_nothrow_default_constructible): Derive from alias template
__is_nothrow_constructible_impl instead of
__is_nothrow_default_constructible_impl.
* testsuite/20_util/is_nothrow_constructible/94003.cc: New test.
Segher Boessenkool [Wed, 18 Mar 2020 21:58:45 +0000 (21:58 +0000)]
rs6000: Add back some w* constraints (PR91886)
In May and June last year I deleted many of our (vector) constraints.
We can now just use "wa" for those, together with some other
conditions, which can be per alternative using the "enabled" attribute
(which in turn primarily uses the "isa" attribute).
But, it turns out that Clang implements some of those constraints as
well, and at least musl uses some of them. It is easy for us to add
those constraints back (as undocumented aliases to "wa", which always
did mean the same thing for valid inline assembler code), so do that.
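A hypothetical example (not from the patch) of inline asm that still uses one of the old names; "wd" historically meant a VSX register holding V2DF and now simply aliases "wa":
  typedef double v2df __attribute__ ((vector_size (16)));

  v2df vabs (v2df x)
  {
    __asm__ ("xvabsdp %x0,%x1" : "=wd" (x) : "wd" (x));
    return x;
  }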
gcc/
* config/rs6000/constraints.md (wd, wf, wi, ws, ww): New undocumented
aliases for "wa".
Jeff Law [Wed, 18 Mar 2020 22:07:28 +0000 (16:07 -0600)]
Complete change to resolve pr90275.
PR rtl-optimization/90275
* cse.c (cse_insn): Delete no-op register moves too.
Martin Sebor [Wed, 18 Mar 2020 20:47:29 +0000 (14:47 -0600)]
PR ipa/92799 - ICE on a weakref function definition followed by a declaration
gcc/testsuite/ChangeLog:
PR ipa/92799
* gcc.dg/attr-weakref-5.c: New test.
gcc/ChangeLog:
PR ipa/92799
* cgraphunit.c (process_function_and_variable_attributes): Also
complain about weakref function definitions and drop all effects
of the attribute.
Srinath Parvathaneni [Wed, 18 Mar 2020 19:16:32 +0000 (19:16 +0000)]
[ARM][GCC][8/5x]: Remaining MVE store intrinsics which store a halfword, word or double word to memory.
This patch supports the following MVE ACLE store intrinsics which store a halfword, word or double word to memory.
vstrdq_scatter_base_p_s64, vstrdq_scatter_base_p_u64, vstrdq_scatter_base_s64, vstrdq_scatter_base_u64, vstrdq_scatter_offset_p_s64, vstrdq_scatter_offset_p_u64, vstrdq_scatter_offset_s64, vstrdq_scatter_offset_u64, vstrdq_scatter_shifted_offset_p_s64,
vstrdq_scatter_shifted_offset_p_u64, vstrdq_scatter_shifted_offset_s64,
vstrdq_scatter_shifted_offset_u64, vstrhq_scatter_offset_f16, vstrhq_scatter_offset_p_f16, vstrhq_scatter_shifted_offset_f16, vstrhq_scatter_shifted_offset_p_f16,
vstrwq_scatter_base_f32, vstrwq_scatter_base_p_f32, vstrwq_scatter_offset_f32, vstrwq_scatter_offset_p_f32, vstrwq_scatter_offset_p_s32, vstrwq_scatter_offset_p_u32, vstrwq_scatter_offset_s32, vstrwq_scatter_offset_u32, vstrwq_scatter_shifted_offset_f32,
vstrwq_scatter_shifted_offset_p_f32, vstrwq_scatter_shifted_offset_p_s32,
vstrwq_scatter_shifted_offset_p_u32, vstrwq_scatter_shifted_offset_s32,
vstrwq_scatter_shifted_offset_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
In this patch a new predicate "Ri" is defined to check that the immediate is in the range of +/-1016 and is a multiple of 8.
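A usage sketch (not from the patch; compile flags such as -march=armv8.1-m.main+mve.fp are an assumption) showing a base-plus-immediate scatter store whose immediate the new "Ri" predicate validates:
  #include <arm_mve.h>
  void scatter2 (uint64x2_t addrs, int64x2_t vals)
  {
    /* The immediate must be in [-1016, 1016] and a multiple of 8.  */
    vstrdq_scatter_base_s64 (addrs, 8, vals);
  }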
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vstrdq_scatter_base_p_s64): Define macro.
(vstrdq_scatter_base_p_u64): Likewise.
(vstrdq_scatter_base_s64): Likewise.
(vstrdq_scatter_base_u64): Likewise.
(vstrdq_scatter_offset_p_s64): Likewise.
(vstrdq_scatter_offset_p_u64): Likewise.
(vstrdq_scatter_offset_s64): Likewise.
(vstrdq_scatter_offset_u64): Likewise.
(vstrdq_scatter_shifted_offset_p_s64): Likewise.
(vstrdq_scatter_shifted_offset_p_u64): Likewise.
(vstrdq_scatter_shifted_offset_s64): Likewise.
(vstrdq_scatter_shifted_offset_u64): Likewise.
(vstrhq_scatter_offset_f16): Likewise.
(vstrhq_scatter_offset_p_f16): Likewise.
(vstrhq_scatter_shifted_offset_f16): Likewise.
(vstrhq_scatter_shifted_offset_p_f16): Likewise.
(vstrwq_scatter_base_f32): Likewise.
(vstrwq_scatter_base_p_f32): Likewise.
(vstrwq_scatter_offset_f32): Likewise.
(vstrwq_scatter_offset_p_f32): Likewise.
(vstrwq_scatter_offset_p_s32): Likewise.
(vstrwq_scatter_offset_p_u32): Likewise.
(vstrwq_scatter_offset_s32): Likewise.
(vstrwq_scatter_offset_u32): Likewise.
(vstrwq_scatter_shifted_offset_f32): Likewise.
(vstrwq_scatter_shifted_offset_p_f32): Likewise.
(vstrwq_scatter_shifted_offset_p_s32): Likewise.
(vstrwq_scatter_shifted_offset_p_u32): Likewise.
(vstrwq_scatter_shifted_offset_s32): Likewise.
(vstrwq_scatter_shifted_offset_u32): Likewise.
(__arm_vstrdq_scatter_base_p_s64): Define intrinsic.
(__arm_vstrdq_scatter_base_p_u64): Likewise.
(__arm_vstrdq_scatter_base_s64): Likewise.
(__arm_vstrdq_scatter_base_u64): Likewise.
(__arm_vstrdq_scatter_offset_p_s64): Likewise.
(__arm_vstrdq_scatter_offset_p_u64): Likewise.
(__arm_vstrdq_scatter_offset_s64): Likewise.
(__arm_vstrdq_scatter_offset_u64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_p_s64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_p_u64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_s64): Likewise.
(__arm_vstrdq_scatter_shifted_offset_u64): Likewise.
(__arm_vstrwq_scatter_offset_p_s32): Likewise.
(__arm_vstrwq_scatter_offset_p_u32): Likewise.
(__arm_vstrwq_scatter_offset_s32): Likewise.
(__arm_vstrwq_scatter_offset_u32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_p_s32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_p_u32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_s32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_u32): Likewise.
(__arm_vstrhq_scatter_offset_f16): Likewise.
(__arm_vstrhq_scatter_offset_p_f16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_f16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_f16): Likewise.
(__arm_vstrwq_scatter_base_f32): Likewise.
(__arm_vstrwq_scatter_base_p_f32): Likewise.
(__arm_vstrwq_scatter_offset_f32): Likewise.
(__arm_vstrwq_scatter_offset_p_f32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_f32): Likewise.
(__arm_vstrwq_scatter_shifted_offset_p_f32): Likewise.
(vstrhq_scatter_offset): Define polymorphic variant.
(vstrhq_scatter_offset_p): Likewise.
(vstrhq_scatter_shifted_offset): Likewise.
(vstrhq_scatter_shifted_offset_p): Likewise.
(vstrwq_scatter_base): Likewise.
(vstrwq_scatter_base_p): Likewise.
(vstrwq_scatter_offset): Likewise.
(vstrwq_scatter_offset_p): Likewise.
(vstrwq_scatter_shifted_offset): Likewise.
(vstrwq_scatter_shifted_offset_p): Likewise.
(vstrdq_scatter_base_p): Likewise.
(vstrdq_scatter_base): Likewise.
(vstrdq_scatter_offset_p): Likewise.
(vstrdq_scatter_offset): Likewise.
(vstrdq_scatter_shifted_offset_p): Likewise.
(vstrdq_scatter_shifted_offset): Likewise.
* config/arm/arm_mve_builtins.def (STRSBS): Use builtin qualifier.
(STRSBS_P): Likewise.
(STRSBU): Likewise.
(STRSBU_P): Likewise.
(STRSS): Likewise.
(STRSS_P): Likewise.
(STRSU): Likewise.
(STRSU_P): Likewise.
* config/arm/constraints.md (Ri): Define.
* config/arm/mve.md (VSTRDSBQ): Define iterator.
(VSTRDSOQ): Likewise.
(VSTRDSSOQ): Likewise.
(VSTRWSOQ): Likewise.
(VSTRWSSOQ): Likewise.
(mve_vstrdq_scatter_base_p_<supf>v2di): Define RTL pattern.
(mve_vstrdq_scatter_base_<supf>v2di): Likewise.
(mve_vstrdq_scatter_offset_p_<supf>v2di): Likewise.
(mve_vstrdq_scatter_offset_<supf>v2di): Likewise.
(mve_vstrdq_scatter_shifted_offset_p_<supf>v2di): Likewise.
(mve_vstrdq_scatter_shifted_offset_<supf>v2di): Likewise.
(mve_vstrhq_scatter_offset_fv8hf): Likewise.
(mve_vstrhq_scatter_offset_p_fv8hf): Likewise.
(mve_vstrhq_scatter_shifted_offset_fv8hf): Likewise.
(mve_vstrhq_scatter_shifted_offset_p_fv8hf): Likewise.
(mve_vstrwq_scatter_base_fv4sf): Likewise.
(mve_vstrwq_scatter_base_p_fv4sf): Likewise.
(mve_vstrwq_scatter_offset_fv4sf): Likewise.
(mve_vstrwq_scatter_offset_p_fv4sf): Likewise.
(mve_vstrwq_scatter_offset_p_<supf>v4si): Likewise.
(mve_vstrwq_scatter_offset_<supf>v4si): Likewise.
(mve_vstrwq_scatter_shifted_offset_fv4sf): Likewise.
(mve_vstrwq_scatter_shifted_offset_p_fv4sf): Likewise.
(mve_vstrwq_scatter_shifted_offset_p_<supf>v4si): Likewise.
(mve_vstrwq_scatter_shifted_offset_<supf>v4si): Likewise.
* config/arm/predicates.md (Ri): Define predicate to check that the immediate
is in the range +/-1016 and a multiple of 8.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_p_s64.c: New test.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_p_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_base_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_p_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_p_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_offset_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_p_s64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_p_u64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_s64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrdq_scatter_shifted_offset_u64.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_f16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_f16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_f32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_f32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_p_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_shifted_offset_u32.c:
Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 19:08:29 +0000 (19:08 +0000)]
[ARM][GCC][7/5x]: MVE store intrinsics which store a byte, halfword or word to memory.
This patch supports the following MVE ACLE store intrinsics which store a byte, halfword, or word to memory.
vst1q_f32, vst1q_f16, vst1q_s8, vst1q_s32, vst1q_s16, vst1q_u8, vst1q_u32, vst1q_u16, vstrhq_f16, vstrhq_scatter_offset_s32, vstrhq_scatter_offset_s16, vstrhq_scatter_offset_u32, vstrhq_scatter_offset_u16, vstrhq_scatter_offset_p_s32, vstrhq_scatter_offset_p_s16, vstrhq_scatter_offset_p_u32, vstrhq_scatter_offset_p_u16, vstrhq_scatter_shifted_offset_s32,
vstrhq_scatter_shifted_offset_s16, vstrhq_scatter_shifted_offset_u32,
vstrhq_scatter_shifted_offset_u16, vstrhq_scatter_shifted_offset_p_s32,
vstrhq_scatter_shifted_offset_p_s16, vstrhq_scatter_shifted_offset_p_u32,
vstrhq_scatter_shifted_offset_p_u16, vstrhq_s32, vstrhq_s16, vstrhq_u32, vstrhq_u16, vstrhq_p_f16, vstrhq_p_s32, vstrhq_p_s16, vstrhq_p_u32, vstrhq_p_u16, vstrwq_f32, vstrwq_s32, vstrwq_u32, vstrwq_p_f32, vstrwq_p_s32, vstrwq_p_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
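A minimal usage sketch (not from the patch) of one of the new contiguous stores; the polymorphic form is also shown:
  #include <arm_mve.h>
  void store4 (int32_t *p, int32x4_t v)
  {
    vst1q_s32 (p, v);   /* equivalently: vst1q (p, v) */
  }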
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vst1q_f32): Define macro.
(vst1q_f16): Likewise.
(vst1q_s8): Likewise.
(vst1q_s32): Likewise.
(vst1q_s16): Likewise.
(vst1q_u8): Likewise.
(vst1q_u32): Likewise.
(vst1q_u16): Likewise.
(vstrhq_f16): Likewise.
(vstrhq_scatter_offset_s32): Likewise.
(vstrhq_scatter_offset_s16): Likewise.
(vstrhq_scatter_offset_u32): Likewise.
(vstrhq_scatter_offset_u16): Likewise.
(vstrhq_scatter_offset_p_s32): Likewise.
(vstrhq_scatter_offset_p_s16): Likewise.
(vstrhq_scatter_offset_p_u32): Likewise.
(vstrhq_scatter_offset_p_u16): Likewise.
(vstrhq_scatter_shifted_offset_s32): Likewise.
(vstrhq_scatter_shifted_offset_s16): Likewise.
(vstrhq_scatter_shifted_offset_u32): Likewise.
(vstrhq_scatter_shifted_offset_u16): Likewise.
(vstrhq_scatter_shifted_offset_p_s32): Likewise.
(vstrhq_scatter_shifted_offset_p_s16): Likewise.
(vstrhq_scatter_shifted_offset_p_u32): Likewise.
(vstrhq_scatter_shifted_offset_p_u16): Likewise.
(vstrhq_s32): Likewise.
(vstrhq_s16): Likewise.
(vstrhq_u32): Likewise.
(vstrhq_u16): Likewise.
(vstrhq_p_f16): Likewise.
(vstrhq_p_s32): Likewise.
(vstrhq_p_s16): Likewise.
(vstrhq_p_u32): Likewise.
(vstrhq_p_u16): Likewise.
(vstrwq_f32): Likewise.
(vstrwq_s32): Likewise.
(vstrwq_u32): Likewise.
(vstrwq_p_f32): Likewise.
(vstrwq_p_s32): Likewise.
(vstrwq_p_u32): Likewise.
(__arm_vst1q_s8): Define intrinsic.
(__arm_vst1q_s32): Likewise.
(__arm_vst1q_s16): Likewise.
(__arm_vst1q_u8): Likewise.
(__arm_vst1q_u32): Likewise.
(__arm_vst1q_u16): Likewise.
(__arm_vstrhq_scatter_offset_s32): Likewise.
(__arm_vstrhq_scatter_offset_s16): Likewise.
(__arm_vstrhq_scatter_offset_u32): Likewise.
(__arm_vstrhq_scatter_offset_u16): Likewise.
(__arm_vstrhq_scatter_offset_p_s32): Likewise.
(__arm_vstrhq_scatter_offset_p_s16): Likewise.
(__arm_vstrhq_scatter_offset_p_u32): Likewise.
(__arm_vstrhq_scatter_offset_p_u16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_s32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_s16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_u32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_u16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_s32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_s16): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_u32): Likewise.
(__arm_vstrhq_scatter_shifted_offset_p_u16): Likewise.
(__arm_vstrhq_s32): Likewise.
(__arm_vstrhq_s16): Likewise.
(__arm_vstrhq_u32): Likewise.
(__arm_vstrhq_u16): Likewise.
(__arm_vstrhq_p_s32): Likewise.
(__arm_vstrhq_p_s16): Likewise.
(__arm_vstrhq_p_u32): Likewise.
(__arm_vstrhq_p_u16): Likewise.
(__arm_vstrwq_s32): Likewise.
(__arm_vstrwq_u32): Likewise.
(__arm_vstrwq_p_s32): Likewise.
(__arm_vstrwq_p_u32): Likewise.
(__arm_vstrwq_p_f32): Likewise.
(__arm_vstrwq_f32): Likewise.
(__arm_vst1q_f32): Likewise.
(__arm_vst1q_f16): Likewise.
(__arm_vstrhq_f16): Likewise.
(__arm_vstrhq_p_f16): Likewise.
(vst1q): Define polymorphic variant.
(vstrhq): Likewise.
(vstrhq_p): Likewise.
(vstrhq_scatter_offset_p): Likewise.
(vstrhq_scatter_offset): Likewise.
(vstrhq_scatter_shifted_offset_p): Likewise.
(vstrhq_scatter_shifted_offset): Likewise.
(vstrwq_p): Likewise.
(vstrwq): Likewise.
* config/arm/arm_mve_builtins.def (STRS): Use builtin qualifier.
(STRS_P): Likewise.
(STRSS): Likewise.
(STRSS_P): Likewise.
(STRSU): Likewise.
(STRSU_P): Likewise.
(STRU): Likewise.
(STRU_P): Likewise.
* config/arm/mve.md (VST1Q): Define iterator.
(VSTRHSOQ): Likewise.
(VSTRHSSOQ): Likewise.
(VSTRHQ): Likewise.
(VSTRWQ): Likewise.
(mve_vstrhq_fv8hf): Define RTL pattern.
(mve_vstrhq_p_fv8hf): Likewise.
(mve_vstrhq_p_<supf><mode>): Likewise.
(mve_vstrhq_scatter_offset_p_<supf><mode>): Likewise.
(mve_vstrhq_scatter_offset_<supf><mode>): Likewise.
(mve_vstrhq_scatter_shifted_offset_p_<supf><mode>): Likewise.
(mve_vstrhq_scatter_shifted_offset_<supf><mode>): Likewise.
(mve_vstrhq_<supf><mode>): Likewise.
(mve_vstrwq_fv4sf): Likewise.
(mve_vstrwq_p_fv4sf): Likewise.
(mve_vstrwq_p_<supf>v4si): Likewise.
(mve_vstrwq_<supf>v4si): Likewise.
(mve_vst1q_f<mode>): Define expand.
(mve_vst1q_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vst1q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vst1q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vst1q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_s16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_u16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_p_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_s16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_s32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_u16.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_scatter_shifted_offset_u32.c:
Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:58:48 +0000 (18:58 +0000)]
[ARM][GCC][6/5x]: Remaining MVE load intrinsics which load a halfword, word or double word from memory.
This patch supports the remaining MVE ACLE load intrinsics which load a halfword,
word or double word from memory.
vldrdq_gather_base_s64, vldrdq_gather_base_u64, vldrdq_gather_base_z_s64,
vldrdq_gather_base_z_u64, vldrdq_gather_offset_s64, vldrdq_gather_offset_u64,
vldrdq_gather_offset_z_s64, vldrdq_gather_offset_z_u64, vldrdq_gather_shifted_offset_s64,
vldrdq_gather_shifted_offset_u64, vldrdq_gather_shifted_offset_z_s64,
vldrdq_gather_shifted_offset_z_u64, vldrhq_gather_offset_f16, vldrhq_gather_offset_z_f16,
vldrhq_gather_shifted_offset_f16, vldrhq_gather_shifted_offset_z_f16, vldrwq_gather_base_f32,
vldrwq_gather_base_z_f32, vldrwq_gather_offset_f32, vldrwq_gather_offset_s32,
vldrwq_gather_offset_u32, vldrwq_gather_offset_z_f32, vldrwq_gather_offset_z_s32,
vldrwq_gather_offset_z_u32, vldrwq_gather_shifted_offset_f32, vldrwq_gather_shifted_offset_s32,
vldrwq_gather_shifted_offset_u32, vldrwq_gather_shifted_offset_z_f32,
vldrwq_gather_shifted_offset_z_s32, vldrwq_gather_shifted_offset_z_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
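A minimal usage sketch (not from the patch) of one of the gather loads; for the _offset form the per-lane offsets are given in bytes (the _shifted_offset form scales them by the element size):
  #include <arm_mve.h>
  int32x4_t gather4 (const int32_t *base, uint32x4_t offsets)
  {
    return vldrwq_gather_offset_s32 (base, offsets);
  }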
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vld1q_s8): Define macro.
(vld1q_s32): Likewise.
(vld1q_s16): Likewise.
(vld1q_u8): Likewise.
(vld1q_u32): Likewise.
(vld1q_u16): Likewise.
(vldrhq_gather_offset_s32): Likewise.
(vldrhq_gather_offset_s16): Likewise.
(vldrhq_gather_offset_u32): Likewise.
(vldrhq_gather_offset_u16): Likewise.
(vldrhq_gather_offset_z_s32): Likewise.
(vldrhq_gather_offset_z_s16): Likewise.
(vldrhq_gather_offset_z_u32): Likewise.
(vldrhq_gather_offset_z_u16): Likewise.
(vldrhq_gather_shifted_offset_s32): Likewise.
(vldrhq_gather_shifted_offset_s16): Likewise.
(vldrhq_gather_shifted_offset_u32): Likewise.
(vldrhq_gather_shifted_offset_u16): Likewise.
(vldrhq_gather_shifted_offset_z_s32): Likewise.
(vldrhq_gather_shifted_offset_z_s16): Likewise.
(vldrhq_gather_shifted_offset_z_u32): Likewise.
(vldrhq_gather_shifted_offset_z_u16): Likewise.
(vldrhq_s32): Likewise.
(vldrhq_s16): Likewise.
(vldrhq_u32): Likewise.
(vldrhq_u16): Likewise.
(vldrhq_z_s32): Likewise.
(vldrhq_z_s16): Likewise.
(vldrhq_z_u32): Likewise.
(vldrhq_z_u16): Likewise.
(vldrwq_s32): Likewise.
(vldrwq_u32): Likewise.
(vldrwq_z_s32): Likewise.
(vldrwq_z_u32): Likewise.
(vld1q_f32): Likewise.
(vld1q_f16): Likewise.
(vldrhq_f16): Likewise.
(vldrhq_z_f16): Likewise.
(vldrwq_f32): Likewise.
(vldrwq_z_f32): Likewise.
(__arm_vld1q_s8): Define intrinsic.
(__arm_vld1q_s32): Likewise.
(__arm_vld1q_s16): Likewise.
(__arm_vld1q_u8): Likewise.
(__arm_vld1q_u32): Likewise.
(__arm_vld1q_u16): Likewise.
(__arm_vldrhq_gather_offset_s32): Likewise.
(__arm_vldrhq_gather_offset_s16): Likewise.
(__arm_vldrhq_gather_offset_u32): Likewise.
(__arm_vldrhq_gather_offset_u16): Likewise.
(__arm_vldrhq_gather_offset_z_s32): Likewise.
(__arm_vldrhq_gather_offset_z_s16): Likewise.
(__arm_vldrhq_gather_offset_z_u32): Likewise.
(__arm_vldrhq_gather_offset_z_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u16): Likewise.
(__arm_vldrhq_s32): Likewise.
(__arm_vldrhq_s16): Likewise.
(__arm_vldrhq_u32): Likewise.
(__arm_vldrhq_u16): Likewise.
(__arm_vldrhq_z_s32): Likewise.
(__arm_vldrhq_z_s16): Likewise.
(__arm_vldrhq_z_u32): Likewise.
(__arm_vldrhq_z_u16): Likewise.
(__arm_vldrwq_s32): Likewise.
(__arm_vldrwq_u32): Likewise.
(__arm_vldrwq_z_s32): Likewise.
(__arm_vldrwq_z_u32): Likewise.
(__arm_vld1q_f32): Likewise.
(__arm_vld1q_f16): Likewise.
(__arm_vldrwq_f32): Likewise.
(__arm_vldrwq_z_f32): Likewise.
(__arm_vldrhq_z_f16): Likewise.
(__arm_vldrhq_f16): Likewise.
(vld1q): Define polymorphic variant.
(vldrhq_gather_offset): Likewise.
(vldrhq_gather_offset_z): Likewise.
(vldrhq_gather_shifted_offset): Likewise.
(vldrhq_gather_shifted_offset_z): Likewise.
* config/arm/arm_mve_builtins.def (LDRU): Use builtin qualifier.
(LDRS): Likewise.
(LDRU_Z): Likewise.
(LDRS_Z): Likewise.
(LDRGU_Z): Likewise.
(LDRGU): Likewise.
(LDRGS_Z): Likewise.
(LDRGS): Likewise.
* config/arm/mve.md (MVE_H_ELEM): Define mode iterator.
(V_sz_elem1): Likewise.
(VLD1Q): Define iterator.
(VLDRHGOQ): Likewise.
(VLDRHGSOQ): Likewise.
(VLDRHQ): Likewise.
(VLDRWQ): Likewise.
(mve_vldrhq_fv8hf): Define RTL pattern.
(mve_vldrhq_gather_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_<supf><mode>): Likewise.
(mve_vldrhq_z_fv8hf): Likewise.
(mve_vldrhq_z_<supf><mode>): Likewise.
(mve_vldrwq_fv4sf): Likewise.
(mve_vldrwq_<supf>v4si): Likewise.
(mve_vldrwq_z_fv4sf): Likewise.
(mve_vldrwq_z_<supf>v4si): Likewise.
(mve_vld1q_f<mode>): Define RTL expand pattern.
(mve_vld1q_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vld1q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vld1q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:48:05 +0000 (18:48 +0000)]
[ARM][GCC][5/5x]: MVE ACLE load intrinsics which load a byte, halfword, or word from memory.
This patch supports the following MVE ACLE load intrinsics which load a byte, halfword,
or word from memory.
vld1q_s8, vld1q_s32, vld1q_s16, vld1q_u8, vld1q_u32, vld1q_u16, vldrhq_gather_offset_s32,
vldrhq_gather_offset_s16, vldrhq_gather_offset_u32, vldrhq_gather_offset_u16,
vldrhq_gather_offset_z_s32, vldrhq_gather_offset_z_s16, vldrhq_gather_offset_z_u32,
vldrhq_gather_offset_z_u16, vldrhq_gather_shifted_offset_s32, vldrwq_f32, vldrwq_z_f32,
vldrhq_gather_shifted_offset_s16, vldrhq_gather_shifted_offset_u32,
vldrhq_gather_shifted_offset_u16, vldrhq_gather_shifted_offset_z_s32,
vldrhq_gather_shifted_offset_z_s16, vldrhq_gather_shifted_offset_z_u32,
vldrhq_gather_shifted_offset_z_u16, vldrhq_s32, vldrhq_s16, vldrhq_u32, vldrhq_u16,
vldrhq_z_s32, vldrhq_z_s16, vldrhq_z_u32, vldrhq_z_u16, vldrwq_s32, vldrwq_u32,
vldrwq_z_s32, vldrwq_z_u32, vld1q_f32, vld1q_f16, vldrhq_f16, vldrhq_z_f16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
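A minimal usage sketch (not from the patch) of one of the new contiguous loads:
  #include <arm_mve.h>
  uint16x8_t load8 (const uint16_t *p)
  {
    return vld1q_u16 (p);   /* equivalently: vld1q (p) */
  }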
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vld1q_s8): Define macro.
(vld1q_s32): Likewise.
(vld1q_s16): Likewise.
(vld1q_u8): Likewise.
(vld1q_u32): Likewise.
(vld1q_u16): Likewise.
(vldrhq_gather_offset_s32): Likewise.
(vldrhq_gather_offset_s16): Likewise.
(vldrhq_gather_offset_u32): Likewise.
(vldrhq_gather_offset_u16): Likewise.
(vldrhq_gather_offset_z_s32): Likewise.
(vldrhq_gather_offset_z_s16): Likewise.
(vldrhq_gather_offset_z_u32): Likewise.
(vldrhq_gather_offset_z_u16): Likewise.
(vldrhq_gather_shifted_offset_s32): Likewise.
(vldrhq_gather_shifted_offset_s16): Likewise.
(vldrhq_gather_shifted_offset_u32): Likewise.
(vldrhq_gather_shifted_offset_u16): Likewise.
(vldrhq_gather_shifted_offset_z_s32): Likewise.
(vldrhq_gather_shifted_offset_z_s16): Likewise.
(vldrhq_gather_shifted_offset_z_u32): Likewise.
(vldrhq_gather_shifted_offset_z_u16): Likewise.
(vldrhq_s32): Likewise.
(vldrhq_s16): Likewise.
(vldrhq_u32): Likewise.
(vldrhq_u16): Likewise.
(vldrhq_z_s32): Likewise.
(vldrhq_z_s16): Likewise.
(vldrhq_z_u32): Likewise.
(vldrhq_z_u16): Likewise.
(vldrwq_s32): Likewise.
(vldrwq_u32): Likewise.
(vldrwq_z_s32): Likewise.
(vldrwq_z_u32): Likewise.
(vld1q_f32): Likewise.
(vld1q_f16): Likewise.
(vldrhq_f16): Likewise.
(vldrhq_z_f16): Likewise.
(vldrwq_f32): Likewise.
(vldrwq_z_f32): Likewise.
(__arm_vld1q_s8): Define intrinsic.
(__arm_vld1q_s32): Likewise.
(__arm_vld1q_s16): Likewise.
(__arm_vld1q_u8): Likewise.
(__arm_vld1q_u32): Likewise.
(__arm_vld1q_u16): Likewise.
(__arm_vldrhq_gather_offset_s32): Likewise.
(__arm_vldrhq_gather_offset_s16): Likewise.
(__arm_vldrhq_gather_offset_u32): Likewise.
(__arm_vldrhq_gather_offset_u16): Likewise.
(__arm_vldrhq_gather_offset_z_s32): Likewise.
(__arm_vldrhq_gather_offset_z_s16): Likewise.
(__arm_vldrhq_gather_offset_z_u32): Likewise.
(__arm_vldrhq_gather_offset_z_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_u16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_s16): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u32): Likewise.
(__arm_vldrhq_gather_shifted_offset_z_u16): Likewise.
(__arm_vldrhq_s32): Likewise.
(__arm_vldrhq_s16): Likewise.
(__arm_vldrhq_u32): Likewise.
(__arm_vldrhq_u16): Likewise.
(__arm_vldrhq_z_s32): Likewise.
(__arm_vldrhq_z_s16): Likewise.
(__arm_vldrhq_z_u32): Likewise.
(__arm_vldrhq_z_u16): Likewise.
(__arm_vldrwq_s32): Likewise.
(__arm_vldrwq_u32): Likewise.
(__arm_vldrwq_z_s32): Likewise.
(__arm_vldrwq_z_u32): Likewise.
(__arm_vld1q_f32): Likewise.
(__arm_vld1q_f16): Likewise.
(__arm_vldrwq_f32): Likewise.
(__arm_vldrwq_z_f32): Likewise.
(__arm_vldrhq_z_f16): Likewise.
(__arm_vldrhq_f16): Likewise.
(vld1q): Define polymorphic variant.
(vldrhq_gather_offset): Likewise.
(vldrhq_gather_offset_z): Likewise.
(vldrhq_gather_shifted_offset): Likewise.
(vldrhq_gather_shifted_offset_z): Likewise.
* config/arm/arm_mve_builtins.def (LDRU): Use builtin qualifier.
(LDRS): Likewise.
(LDRU_Z): Likewise.
(LDRS_Z): Likewise.
(LDRGU_Z): Likewise.
(LDRGU): Likewise.
(LDRGS_Z): Likewise.
(LDRGS): Likewise.
* config/arm/mve.md (MVE_H_ELEM): Define mode iterator.
(V_sz_elem1): Likewise.
(VLD1Q): Define iterator.
(VLDRHGOQ): Likewise.
(VLDRHGSOQ): Likewise.
(VLDRHQ): Likewise.
(VLDRWQ): Likewise.
(mve_vldrhq_fv8hf): Define RTL pattern.
(mve_vldrhq_gather_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_<supf><mode>): Likewise.
(mve_vldrhq_gather_shifted_offset_z_<supf><mode>): Likewise.
(mve_vldrhq_<supf><mode>): Likewise.
(mve_vldrhq_z_fv8hf): Likewise.
(mve_vldrhq_z_<supf><mode>): Likewise.
(mve_vldrwq_fv4sf): Likewise.
(mve_vldrwq_<supf>v4si): Likewise.
(mve_vldrwq_z_fv4sf): Likewise.
(mve_vldrwq_z_<supf>v4si): Likewise.
(mve_vld1q_f<mode>): Define RTL expand pattern.
(mve_vld1q_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vld1q_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vld1q_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vld1q_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_gather_shifted_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrhq_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_z_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:35:17 +0000 (18:35 +0000)]
[ARM][GCC][4/5x]: MVE load intrinsics with zero(_z) suffix.
This patch supports the following MVE ACLE load intrinsics with zero(_z) suffix.
* ``_z`` (zero) which indicates false-predicated lanes are filled with zeroes; these are only used for load instructions.
vldrbq_gather_offset_z_s16, vldrbq_gather_offset_z_u8, vldrbq_gather_offset_z_s32, vldrbq_gather_offset_z_u16, vldrbq_gather_offset_z_u32, vldrbq_gather_offset_z_s8, vldrbq_z_s16, vldrbq_z_u8, vldrbq_z_s8, vldrbq_z_s32, vldrbq_z_u16, vldrbq_z_u32, vldrwq_gather_base_z_u32, vldrwq_gather_base_z_s32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
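A minimal usage sketch (not from the patch) combining a _z load with a tail predicate; vctp8q is assumed to be available from earlier MVE support:
  #include <arm_mve.h>
  int8x16_t load_first_n (const int8_t *p, unsigned n)
  {
    mve_pred16_t pred = vctp8q (n);   /* predicate covering the first n lanes */
    return vldrbq_z_s8 (p, pred);     /* false-predicated lanes are zero */
  }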
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (LDRGBS_Z_QUALIFIERS): Define builtin
qualifier.
(LDRGBU_Z_QUALIFIERS): Likewise.
(LDRGS_Z_QUALIFIERS): Likewise.
(LDRGU_Z_QUALIFIERS): Likewise.
(LDRS_Z_QUALIFIERS): Likewise.
(LDRU_Z_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vldrbq_gather_offset_z_s16): Define macro.
(vldrbq_gather_offset_z_u8): Likewise.
(vldrbq_gather_offset_z_s32): Likewise.
(vldrbq_gather_offset_z_u16): Likewise.
(vldrbq_gather_offset_z_u32): Likewise.
(vldrbq_gather_offset_z_s8): Likewise.
(vldrbq_z_s16): Likewise.
(vldrbq_z_u8): Likewise.
(vldrbq_z_s8): Likewise.
(vldrbq_z_s32): Likewise.
(vldrbq_z_u16): Likewise.
(vldrbq_z_u32): Likewise.
(vldrwq_gather_base_z_u32): Likewise.
(vldrwq_gather_base_z_s32): Likewise.
(__arm_vldrbq_gather_offset_z_s8): Define intrinsic.
(__arm_vldrbq_gather_offset_z_s32): Likewise.
(__arm_vldrbq_gather_offset_z_s16): Likewise.
(__arm_vldrbq_gather_offset_z_u8): Likewise.
(__arm_vldrbq_gather_offset_z_u32): Likewise.
(__arm_vldrbq_gather_offset_z_u16): Likewise.
(__arm_vldrbq_z_s8): Likewise.
(__arm_vldrbq_z_s32): Likewise.
(__arm_vldrbq_z_s16): Likewise.
(__arm_vldrbq_z_u8): Likewise.
(__arm_vldrbq_z_u32): Likewise.
(__arm_vldrbq_z_u16): Likewise.
(__arm_vldrwq_gather_base_z_s32): Likewise.
(__arm_vldrwq_gather_base_z_u32): Likewise.
(vldrbq_gather_offset_z): Define polymorphic variant.
* config/arm/arm_mve_builtins.def (LDRGBS_Z_QUALIFIERS): Use builtin
qualifier.
(LDRGBU_Z_QUALIFIERS): Likewise.
(LDRGS_Z_QUALIFIERS): Likewise.
(LDRGU_Z_QUALIFIERS): Likewise.
(LDRS_Z_QUALIFIERS): Likewise.
(LDRU_Z_QUALIFIERS): Likewise.
* config/arm/mve.md (mve_vldrbq_gather_offset_z_<supf><mode>): Define
RTL pattern.
(mve_vldrbq_z_<supf><mode>): Likewise.
(mve_vldrwq_gather_base_z_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_z_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_z_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_z_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_z_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:22:21 +0000 (18:22 +0000)]
[ARM][GCC][3/5x]: MVE store intrinsics with predicated suffix.
This patch supports the following MVE ACLE store intrinsics with predicated suffix.
vstrbq_p_s8, vstrbq_p_s32, vstrbq_p_s16, vstrbq_p_u8, vstrbq_p_u32, vstrbq_p_u16, vstrbq_scatter_offset_p_s8, vstrbq_scatter_offset_p_s32, vstrbq_scatter_offset_p_s16, vstrbq_scatter_offset_p_u8, vstrbq_scatter_offset_p_u32, vstrbq_scatter_offset_p_u16, vstrwq_scatter_base_p_s32, vstrwq_scatter_base_p_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
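A minimal usage sketch (not from the patch) of a predicated store; vctp8q is assumed to be available from earlier MVE support:
  #include <arm_mve.h>
  void store_first_n (int8_t *p, int8x16_t v, unsigned n)
  {
    vstrbq_p_s8 (p, v, vctp8q (n));   /* only the first n lanes are written */
  }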
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (STRS_P_QUALIFIERS): Define builtin
qualifier.
(STRU_P_QUALIFIERS): Likewise.
(STRSU_P_QUALIFIERS): Likewise.
(STRSS_P_QUALIFIERS): Likewise.
(STRSBS_P_QUALIFIERS): Likewise.
(STRSBU_P_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vstrbq_p_s8): Define macro.
(vstrbq_p_s32): Likewise.
(vstrbq_p_s16): Likewise.
(vstrbq_p_u8): Likewise.
(vstrbq_p_u32): Likewise.
(vstrbq_p_u16): Likewise.
(vstrbq_scatter_offset_p_s8): Likewise.
(vstrbq_scatter_offset_p_s32): Likewise.
(vstrbq_scatter_offset_p_s16): Likewise.
(vstrbq_scatter_offset_p_u8): Likewise.
(vstrbq_scatter_offset_p_u32): Likewise.
(vstrbq_scatter_offset_p_u16): Likewise.
(vstrwq_scatter_base_p_s32): Likewise.
(vstrwq_scatter_base_p_u32): Likewise.
(__arm_vstrbq_p_s8): Define intrinsic.
(__arm_vstrbq_p_s32): Likewise.
(__arm_vstrbq_p_s16): Likewise.
(__arm_vstrbq_p_u8): Likewise.
(__arm_vstrbq_p_u32): Likewise.
(__arm_vstrbq_p_u16): Likewise.
(__arm_vstrbq_scatter_offset_p_s8): Likewise.
(__arm_vstrbq_scatter_offset_p_s32): Likewise.
(__arm_vstrbq_scatter_offset_p_s16): Likewise.
(__arm_vstrbq_scatter_offset_p_u8): Likewise.
(__arm_vstrbq_scatter_offset_p_u32): Likewise.
(__arm_vstrbq_scatter_offset_p_u16): Likewise.
(__arm_vstrwq_scatter_base_p_s32): Likewise.
(__arm_vstrwq_scatter_base_p_u32): Likewise.
(vstrbq_p): Define polymorphic variant.
(vstrbq_scatter_offset_p): Likewise.
(vstrwq_scatter_base_p): Likewise.
* config/arm/arm_mve_builtins.def (STRS_P_QUALIFIERS): Use builtin
qualifier.
(STRU_P_QUALIFIERS): Likewise.
(STRSU_P_QUALIFIERS): Likewise.
(STRSS_P_QUALIFIERS): Likewise.
(STRSBS_P_QUALIFIERS): Likewise.
(STRSBU_P_QUALIFIERS): Likewise.
* config/arm/mve.md (mve_vstrbq_scatter_offset_p_<supf><mode>): Define
RTL pattern.
(mve_vstrwq_scatter_base_p_<supf>v4si): Likewise.
(mve_vstrbq_p_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vstrbq_p_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vstrbq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_p_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 18:13:53 +0000 (18:13 +0000)]
[ARM][GCC][2/5x]: MVE load intrinsics.
This patch supports the following MVE ACLE load intrinsics.
vldrbq_gather_offset_u8, vldrbq_gather_offset_s8, vldrbq_s8, vldrbq_u8, vldrbq_gather_offset_u16, vldrbq_gather_offset_s16, vldrbq_s16, vldrbq_u16, vldrbq_gather_offset_u32, vldrbq_gather_offset_s32, vldrbq_s32, vldrbq_u32, vldrwq_gather_base_s32, vldrwq_gather_base_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
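For illustration (not part of the patch): a short sketch of the new loads, assuming <arm_mve.h> from this series and an MVE-enabled compiler invocation (e.g. -march=armv8.1-m.main+mve); signatures follow the MVE ACLE specification linked above.
#include <arm_mve.h>
/* Byte gather through a per-lane offset table: res[i] = base[offs[i]].  */
uint8x16_t gather_bytes (const uint8_t *base, uint8x16_t offs)
{
  return vldrbq_gather_offset_u8 (base, offs);
}
/* Widening contiguous load: 8 bytes are loaded and each is sign-extended
   into a 16-bit lane of the result.  */
int16x8_t load_widen (const int8_t *src)
{
  return vldrbq_s16 (src);
}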
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (LDRGU_QUALIFIERS): Define builtin
qualifier.
(LDRGS_QUALIFIERS): Likewise.
(LDRS_QUALIFIERS): Likewise.
(LDRU_QUALIFIERS): Likewise.
(LDRGBS_QUALIFIERS): Likewise.
(LDRGBU_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vldrbq_gather_offset_u8): Define macro.
(vldrbq_gather_offset_s8): Likewise.
(vldrbq_s8): Likewise.
(vldrbq_u8): Likewise.
(vldrbq_gather_offset_u16): Likewise.
(vldrbq_gather_offset_s16): Likewise.
(vldrbq_s16): Likewise.
(vldrbq_u16): Likewise.
(vldrbq_gather_offset_u32): Likewise.
(vldrbq_gather_offset_s32): Likewise.
(vldrbq_s32): Likewise.
(vldrbq_u32): Likewise.
(vldrwq_gather_base_s32): Likewise.
(vldrwq_gather_base_u32): Likewise.
(__arm_vldrbq_gather_offset_u8): Define intrinsic.
(__arm_vldrbq_gather_offset_s8): Likewise.
(__arm_vldrbq_s8): Likewise.
(__arm_vldrbq_u8): Likewise.
(__arm_vldrbq_gather_offset_u16): Likewise.
(__arm_vldrbq_gather_offset_s16): Likewise.
(__arm_vldrbq_s16): Likewise.
(__arm_vldrbq_u16): Likewise.
(__arm_vldrbq_gather_offset_u32): Likewise.
(__arm_vldrbq_gather_offset_s32): Likewise.
(__arm_vldrbq_s32): Likewise.
(__arm_vldrbq_u32): Likewise.
(__arm_vldrwq_gather_base_s32): Likewise.
(__arm_vldrwq_gather_base_u32): Likewise.
(vldrbq_gather_offset): Define polymorphic variant.
* config/arm/arm_mve_builtins.def (LDRGU_QUALIFIERS): Use builtin
qualifier.
(LDRGS_QUALIFIERS): Likewise.
(LDRS_QUALIFIERS): Likewise.
(LDRU_QUALIFIERS): Likewise.
(LDRGBS_QUALIFIERS): Likewise.
(LDRGBU_QUALIFIERS): Likewise.
* config/arm/mve.md (VLDRBGOQ): Define iterator.
(VLDRBQ): Likewise.
(VLDRWGBQ): Likewise.
(mve_vldrbq_gather_offset_<supf><mode>): Define RTL pattern.
(mve_vldrbq_<supf><mode>): Likewise.
(mve_vldrwq_gather_base_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_gather_offset_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrbq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vldrwq_gather_base_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 17:54:30 +0000 (17:54 +0000)]
[ARM][GCC][1/5x]: MVE store intrinsics.
This patch supports the following MVE ACLE store intrinsics.
vstrbq_scatter_offset_s8, vstrbq_scatter_offset_s32, vstrbq_scatter_offset_s16, vstrbq_scatter_offset_u8, vstrbq_scatter_offset_u32, vstrbq_scatter_offset_u16, vstrbq_s8, vstrbq_s32, vstrbq_s16, vstrbq_u8, vstrbq_u32, vstrbq_u16, vstrwq_scatter_base_s32, vstrwq_scatter_base_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
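For illustration (not part of the patch): a short sketch of the new stores under the same assumptions as above (<arm_mve.h>, an MVE-enabled command line); signatures follow the MVE ACLE specification linked above.
#include <arm_mve.h>
/* Narrowing contiguous store: the low byte of each 16-bit lane is written,
   so 8 bytes end up at dst.  */
void store_narrow (int8_t *dst, int16x8_t v)
{
  vstrbq_s16 (dst, v);
}
/* Byte scatter: dst[offs[i]] = v[i] for all 16 lanes.  */
void scatter_bytes (uint8_t *dst, uint8x16_t offs, uint8x16_t v)
{
  vstrbq_scatter_offset_u8 (dst, offs, v);
}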
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (STRS_QUALIFIERS): Define builtin qualifier.
(STRU_QUALIFIERS): Likewise.
(STRSS_QUALIFIERS): Likewise.
(STRSU_QUALIFIERS): Likewise.
(STRSBS_QUALIFIERS): Likewise.
(STRSBU_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vstrbq_s8): Define macro.
(vstrbq_u8): Likewise.
(vstrbq_u16): Likewise.
(vstrbq_scatter_offset_s8): Likewise.
(vstrbq_scatter_offset_u8): Likewise.
(vstrbq_scatter_offset_u16): Likewise.
(vstrbq_s16): Likewise.
(vstrbq_u32): Likewise.
(vstrbq_scatter_offset_s16): Likewise.
(vstrbq_scatter_offset_u32): Likewise.
(vstrbq_s32): Likewise.
(vstrbq_scatter_offset_s32): Likewise.
(vstrwq_scatter_base_s32): Likewise.
(vstrwq_scatter_base_u32): Likewise.
(__arm_vstrbq_scatter_offset_s8): Define intrinsic.
(__arm_vstrbq_scatter_offset_s32): Likewise.
(__arm_vstrbq_scatter_offset_s16): Likewise.
(__arm_vstrbq_scatter_offset_u8): Likewise.
(__arm_vstrbq_scatter_offset_u32): Likewise.
(__arm_vstrbq_scatter_offset_u16): Likewise.
(__arm_vstrbq_s8): Likewise.
(__arm_vstrbq_s32): Likewise.
(__arm_vstrbq_s16): Likewise.
(__arm_vstrbq_u8): Likewise.
(__arm_vstrbq_u32): Likewise.
(__arm_vstrbq_u16): Likewise.
(__arm_vstrwq_scatter_base_s32): Likewise.
(__arm_vstrwq_scatter_base_u32): Likewise.
(vstrbq): Define polymorphic variant.
(vstrbq_scatter_offset): Likewise.
(vstrwq_scatter_base): Likewise.
* config/arm/arm_mve_builtins.def (STRS_QUALIFIERS): Use builtin
qualifier.
(STRU_QUALIFIERS): Likewise.
(STRSS_QUALIFIERS): Likewise.
(STRSU_QUALIFIERS): Likewise.
(STRSBS_QUALIFIERS): Likewise.
(STRSBU_QUALIFIERS): Likewise.
* config/arm/mve.md (MVE_B_ELEM): Define mode attribute iterator.
(VSTRWSBQ): Define iterators.
(VSTRBSOQ): Likewise.
(VSTRBQ): Likewise.
(mve_vstrbq_<supf><mode>): Define RTL pattern.
(mve_vstrbq_scatter_offset_<supf><mode>): Likewise.
(mve_vstrwq_scatter_base_<supf>v4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vstrbq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vstrbq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_scatter_offset_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrbq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vstrwq_scatter_base_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 17:16:21 +0000 (17:16 +0000)]
[ARM][GCC][4/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vabdq_m_f32, vabdq_m_f16, vaddq_m_f32, vaddq_m_f16, vaddq_m_n_f32, vaddq_m_n_f16, vandq_m_f32, vandq_m_f16, vbicq_m_f32, vbicq_m_f16, vbrsrq_m_n_f32, vbrsrq_m_n_f16, vcaddq_rot270_m_f32, vcaddq_rot270_m_f16, vcaddq_rot90_m_f32, vcaddq_rot90_m_f16, vcmlaq_m_f32, vcmlaq_m_f16, vcmlaq_rot180_m_f32, vcmlaq_rot180_m_f16, vcmlaq_rot270_m_f32, vcmlaq_rot270_m_f16, vcmlaq_rot90_m_f32, vcmlaq_rot90_m_f16, vcmulq_m_f32, vcmulq_m_f16, vcmulq_rot180_m_f32, vcmulq_rot180_m_f16, vcmulq_rot270_m_f32, vcmulq_rot270_m_f16, vcmulq_rot90_m_f32, vcmulq_rot90_m_f16, vcvtq_m_n_s32_f32, vcvtq_m_n_s16_f16, vcvtq_m_n_u32_f32, vcvtq_m_n_u16_f16, veorq_m_f32, veorq_m_f16, vfmaq_m_f32, vfmaq_m_f16, vfmaq_m_n_f32, vfmaq_m_n_f16, vfmasq_m_n_f32, vfmasq_m_n_f16, vfmsq_m_f32, vfmsq_m_f16, vmaxnmq_m_f32, vmaxnmq_m_f16, vminnmq_m_f32, vminnmq_m_f16, vmulq_m_f32, vmulq_m_f16, vmulq_m_n_f32, vmulq_m_n_f16, vornq_m_f32, vornq_m_f16, vorrq_m_f32, vorrq_m_f16, vsubq_m_f32, vsubq_m_f16, vsubq_m_n_f32, vsubq_m_n_f16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
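For illustration (not part of the patch): a sketch of the merging-predicated ("_m") floating-point forms, assuming <arm_mve.h> from this series and a command line with the MVE floating-point extension enabled (e.g. -march=armv8.1-m.main+mve.fp -mfloat-abi=hard); signatures follow the MVE ACLE specification linked above.
#include <arm_mve.h>
/* Merging predicated absolute difference: result lanes whose predicate bit
   is clear are copied from 'inactive' instead of being computed.  */
float32x4_t abd_masked (float32x4_t inactive, float32x4_t a, float32x4_t b,
                        mve_pred16_t p)
{
  return vabdq_m_f32 (inactive, a, b, p);
}
/* The same pattern through the polymorphic spelling added by this patch;
   the _f16 suffix is deduced from the argument types.  */
float16x8_t maxnm_masked (float16x8_t inactive, float16x8_t a, float16x8_t b,
                          mve_pred16_t p)
{
  return vmaxnmq_m (inactive, a, b, p);
}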
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vabdq_m_f32): Define macro.
(vabdq_m_f16): Likewise.
(vaddq_m_f32): Likewise.
(vaddq_m_f16): Likewise.
(vaddq_m_n_f32): Likewise.
(vaddq_m_n_f16): Likewise.
(vandq_m_f32): Likewise.
(vandq_m_f16): Likewise.
(vbicq_m_f32): Likewise.
(vbicq_m_f16): Likewise.
(vbrsrq_m_n_f32): Likewise.
(vbrsrq_m_n_f16): Likewise.
(vcaddq_rot270_m_f32): Likewise.
(vcaddq_rot270_m_f16): Likewise.
(vcaddq_rot90_m_f32): Likewise.
(vcaddq_rot90_m_f16): Likewise.
(vcmlaq_m_f32): Likewise.
(vcmlaq_m_f16): Likewise.
(vcmlaq_rot180_m_f32): Likewise.
(vcmlaq_rot180_m_f16): Likewise.
(vcmlaq_rot270_m_f32): Likewise.
(vcmlaq_rot270_m_f16): Likewise.
(vcmlaq_rot90_m_f32): Likewise.
(vcmlaq_rot90_m_f16): Likewise.
(vcmulq_m_f32): Likewise.
(vcmulq_m_f16): Likewise.
(vcmulq_rot180_m_f32): Likewise.
(vcmulq_rot180_m_f16): Likewise.
(vcmulq_rot270_m_f32): Likewise.
(vcmulq_rot270_m_f16): Likewise.
(vcmulq_rot90_m_f32): Likewise.
(vcmulq_rot90_m_f16): Likewise.
(vcvtq_m_n_s32_f32): Likewise.
(vcvtq_m_n_s16_f16): Likewise.
(vcvtq_m_n_u32_f32): Likewise.
(vcvtq_m_n_u16_f16): Likewise.
(veorq_m_f32): Likewise.
(veorq_m_f16): Likewise.
(vfmaq_m_f32): Likewise.
(vfmaq_m_f16): Likewise.
(vfmaq_m_n_f32): Likewise.
(vfmaq_m_n_f16): Likewise.
(vfmasq_m_n_f32): Likewise.
(vfmasq_m_n_f16): Likewise.
(vfmsq_m_f32): Likewise.
(vfmsq_m_f16): Likewise.
(vmaxnmq_m_f32): Likewise.
(vmaxnmq_m_f16): Likewise.
(vminnmq_m_f32): Likewise.
(vminnmq_m_f16): Likewise.
(vmulq_m_f32): Likewise.
(vmulq_m_f16): Likewise.
(vmulq_m_n_f32): Likewise.
(vmulq_m_n_f16): Likewise.
(vornq_m_f32): Likewise.
(vornq_m_f16): Likewise.
(vorrq_m_f32): Likewise.
(vorrq_m_f16): Likewise.
(vsubq_m_f32): Likewise.
(vsubq_m_f16): Likewise.
(vsubq_m_n_f32): Likewise.
(vsubq_m_n_f16): Likewise.
(__attribute__): Likewise.
(__arm_vabdq_m_f32): Define intrinsic.
(__arm_vabdq_m_f16): Likewise.
(__arm_vaddq_m_f32): Likewise.
(__arm_vaddq_m_f16): Likewise.
(__arm_vaddq_m_n_f32): Likewise.
(__arm_vaddq_m_n_f16): Likewise.
(__arm_vandq_m_f32): Likewise.
(__arm_vandq_m_f16): Likewise.
(__arm_vbicq_m_f32): Likewise.
(__arm_vbicq_m_f16): Likewise.
(__arm_vbrsrq_m_n_f32): Likewise.
(__arm_vbrsrq_m_n_f16): Likewise.
(__arm_vcaddq_rot270_m_f32): Likewise.
(__arm_vcaddq_rot270_m_f16): Likewise.
(__arm_vcaddq_rot90_m_f32): Likewise.
(__arm_vcaddq_rot90_m_f16): Likewise.
(__arm_vcmlaq_m_f32): Likewise.
(__arm_vcmlaq_m_f16): Likewise.
(__arm_vcmlaq_rot180_m_f32): Likewise.
(__arm_vcmlaq_rot180_m_f16): Likewise.
(__arm_vcmlaq_rot270_m_f32): Likewise.
(__arm_vcmlaq_rot270_m_f16): Likewise.
(__arm_vcmlaq_rot90_m_f32): Likewise.
(__arm_vcmlaq_rot90_m_f16): Likewise.
(__arm_vcmulq_m_f32): Likewise.
(__arm_vcmulq_m_f16): Likewise.
(__arm_vcmulq_rot180_m_f32): Likewise.
(__arm_vcmulq_rot180_m_f16): Likewise.
(__arm_vcmulq_rot270_m_f32): Likewise.
(__arm_vcmulq_rot270_m_f16): Likewise.
(__arm_vcmulq_rot90_m_f32): Likewise.
(__arm_vcmulq_rot90_m_f16): Likewise.
(__arm_vcvtq_m_n_s32_f32): Likewise.
(__arm_vcvtq_m_n_s16_f16): Likewise.
(__arm_vcvtq_m_n_u32_f32): Likewise.
(__arm_vcvtq_m_n_u16_f16): Likewise.
(__arm_veorq_m_f32): Likewise.
(__arm_veorq_m_f16): Likewise.
(__arm_vfmaq_m_f32): Likewise.
(__arm_vfmaq_m_f16): Likewise.
(__arm_vfmaq_m_n_f32): Likewise.
(__arm_vfmaq_m_n_f16): Likewise.
(__arm_vfmasq_m_n_f32): Likewise.
(__arm_vfmasq_m_n_f16): Likewise.
(__arm_vfmsq_m_f32): Likewise.
(__arm_vfmsq_m_f16): Likewise.
(__arm_vmaxnmq_m_f32): Likewise.
(__arm_vmaxnmq_m_f16): Likewise.
(__arm_vminnmq_m_f32): Likewise.
(__arm_vminnmq_m_f16): Likewise.
(__arm_vmulq_m_f32): Likewise.
(__arm_vmulq_m_f16): Likewise.
(__arm_vmulq_m_n_f32): Likewise.
(__arm_vmulq_m_n_f16): Likewise.
(__arm_vornq_m_f32): Likewise.
(__arm_vornq_m_f16): Likewise.
(__arm_vorrq_m_f32): Likewise.
(__arm_vorrq_m_f16): Likewise.
(__arm_vsubq_m_f32): Likewise.
(__arm_vsubq_m_f16): Likewise.
(__arm_vsubq_m_n_f32): Likewise.
(__arm_vsubq_m_n_f16): Likewise.
(vabdq_m): Define polymorphic variant.
(vaddq_m): Likewise.
(vaddq_m_n): Likewise.
(vandq_m): Likewise.
(vbicq_m): Likewise.
(vbrsrq_m_n): Likewise.
(vcaddq_rot270_m): Likewise.
(vcaddq_rot90_m): Likewise.
(vcmlaq_m): Likewise.
(vcmlaq_rot180_m): Likewise.
(vcmlaq_rot270_m): Likewise.
(vcmlaq_rot90_m): Likewise.
(vcmulq_m): Likewise.
(vcmulq_rot180_m): Likewise.
(vcmulq_rot270_m): Likewise.
(vcmulq_rot90_m): Likewise.
(veorq_m): Likewise.
(vfmaq_m): Likewise.
(vfmaq_m_n): Likewise.
(vfmasq_m_n): Likewise.
(vfmsq_m): Likewise.
(vmaxnmq_m): Likewise.
(vminnmq_m): Likewise.
(vmulq_m): Likewise.
(vmulq_m_n): Likewise.
(vornq_m): Likewise.
(vsubq_m): Likewise.
(vsubq_m_n): Likewise.
(vorrq_m): Likewise.
* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE): Likewise.
* config/arm/mve.md (mve_vabdq_m_f<mode>): Define RTL pattern.
(mve_vaddq_m_f<mode>): Likewise.
(mve_vaddq_m_n_f<mode>): Likewise.
(mve_vandq_m_f<mode>): Likewise.
(mve_vbicq_m_f<mode>): Likewise.
(mve_vbrsrq_m_n_f<mode>): Likewise.
(mve_vcaddq_rot270_m_f<mode>): Likewise.
(mve_vcaddq_rot90_m_f<mode>): Likewise.
(mve_vcmlaq_m_f<mode>): Likewise.
(mve_vcmlaq_rot180_m_f<mode>): Likewise.
(mve_vcmlaq_rot270_m_f<mode>): Likewise.
(mve_vcmlaq_rot90_m_f<mode>): Likewise.
(mve_vcmulq_m_f<mode>): Likewise.
(mve_vcmulq_rot180_m_f<mode>): Likewise.
(mve_vcmulq_rot270_m_f<mode>): Likewise.
(mve_vcmulq_rot90_m_f<mode>): Likewise.
(mve_veorq_m_f<mode>): Likewise.
(mve_vfmaq_m_f<mode>): Likewise.
(mve_vfmaq_m_n_f<mode>): Likewise.
(mve_vfmasq_m_n_f<mode>): Likewise.
(mve_vfmsq_m_f<mode>): Likewise.
(mve_vmaxnmq_m_f<mode>): Likewise.
(mve_vminnmq_m_f<mode>): Likewise.
(mve_vmulq_m_f<mode>): Likewise.
(mve_vmulq_m_n_f<mode>): Likewise.
(mve_vornq_m_f<mode>): Likewise.
(mve_vorrq_m_f<mode>): Likewise.
(mve_vsubq_m_f<mode>): Likewise.
(mve_vsubq_m_n_f<mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_m_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot180_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot270_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmulq_rot90_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_f32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 17:06:58 +0000 (17:06 +0000)]
[ARM][GCC][3/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vmlaldavaq_p_s16, vmlaldavaq_p_s32, vmlaldavaq_p_u16, vmlaldavaq_p_u32, vmlaldavaxq_p_s16, vmlaldavaxq_p_s32, vmlaldavaxq_p_u16, vmlaldavaxq_p_u32, vmlsldavaq_p_s16, vmlsldavaq_p_s32, vmlsldavaxq_p_s16, vmlsldavaxq_p_s32, vmullbq_poly_m_p16, vmullbq_poly_m_p8, vmulltq_poly_m_p16, vmulltq_poly_m_p8, vqdmullbq_m_n_s16, vqdmullbq_m_n_s32, vqdmullbq_m_s16, vqdmullbq_m_s32, vqdmulltq_m_n_s16, vqdmulltq_m_n_s32, vqdmulltq_m_s16, vqdmulltq_m_s32, vqrshrnbq_m_n_s16, vqrshrnbq_m_n_s32, vqrshrnbq_m_n_u16, vqrshrnbq_m_n_u32, vqrshrntq_m_n_s16, vqrshrntq_m_n_s32, vqrshrntq_m_n_u16, vqrshrntq_m_n_u32, vqrshrunbq_m_n_s16, vqrshrunbq_m_n_s32, vqrshruntq_m_n_s16, vqrshruntq_m_n_s32, vqshrnbq_m_n_s16, vqshrnbq_m_n_s32, vqshrnbq_m_n_u16, vqshrnbq_m_n_u32, vqshrntq_m_n_s16, vqshrntq_m_n_s32, vqshrntq_m_n_u16, vqshrntq_m_n_u32, vqshrunbq_m_n_s16, vqshrunbq_m_n_s32, vqshruntq_m_n_s16, vqshruntq_m_n_s32, vrmlaldavhaq_p_s32, vrmlaldavhaq_p_u32, vrmlaldavhaxq_p_s32, vrmlsldavhaq_p_s32, vrmlsldavhaxq_p_s32, vrshrnbq_m_n_s16, vrshrnbq_m_n_s32, vrshrnbq_m_n_u16, vrshrnbq_m_n_u32, vrshrntq_m_n_s16, vrshrntq_m_n_s32, vrshrntq_m_n_u16, vrshrntq_m_n_u32, vshllbq_m_n_s16, vshllbq_m_n_s8, vshllbq_m_n_u16, vshllbq_m_n_u8, vshlltq_m_n_s16, vshlltq_m_n_s8, vshlltq_m_n_u16, vshlltq_m_n_u8, vshrnbq_m_n_s16, vshrnbq_m_n_s32, vshrnbq_m_n_u16, vshrnbq_m_n_u32, vshrntq_m_n_s16, vshrntq_m_n_s32, vshrntq_m_n_u16, vshrntq_m_n_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
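For illustration (not part of the patch): a sketch of two of the new predicated quaternary forms, assuming <arm_mve.h> from this series and an MVE-enabled command line (e.g. -march=armv8.1-m.main+mve); vctp16q comes from another patch in the series, the shift immediate 4 is an arbitrary in-range value, and the signatures follow the MVE ACLE specification linked above.
#include <arm_mve.h>
/* Predicated widening multiply-accumulate reduction: acc += a[i] * b[i]
   over the 16-bit lanes enabled by vctp16q (n), with a 64-bit running sum.  */
int64_t dot_tail (int64_t acc, int16x8_t a, int16x8_t b, unsigned n)
{
  return vmlaldavaq_p_s16 (acc, a, b, vctp16q (n));
}
/* Predicated saturating rounding shift-right and narrow: active 16-bit
   lanes of src are shifted and written into the bottom byte half of dst,
   inactive lanes of dst are preserved.  */
int8x16_t narrow_masked (int8x16_t dst, int16x8_t src, mve_pred16_t p)
{
  return vqrshrnbq_m_n_s16 (dst, src, 4, p);
}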
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-protos.h (arm_mve_immediate_check): Declare.
* config/arm/arm.c (arm_mve_immediate_check): Define function to check
mode and integer value.
* config/arm/arm_mve.h (vmlaldavaq_p_s32): Define macro.
(vmlaldavaq_p_s16): Likewise.
(vmlaldavaq_p_u32): Likewise.
(vmlaldavaq_p_u16): Likewise.
(vmlaldavaxq_p_s32): Likewise.
(vmlaldavaxq_p_s16): Likewise.
(vmlaldavaxq_p_u32): Likewise.
(vmlaldavaxq_p_u16): Likewise.
(vmlsldavaq_p_s32): Likewise.
(vmlsldavaq_p_s16): Likewise.
(vmlsldavaxq_p_s32): Likewise.
(vmlsldavaxq_p_s16): Likewise.
(vmullbq_poly_m_p8): Likewise.
(vmullbq_poly_m_p16): Likewise.
(vmulltq_poly_m_p8): Likewise.
(vmulltq_poly_m_p16): Likewise.
(vqdmullbq_m_n_s32): Likewise.
(vqdmullbq_m_n_s16): Likewise.
(vqdmullbq_m_s32): Likewise.
(vqdmullbq_m_s16): Likewise.
(vqdmulltq_m_n_s32): Likewise.
(vqdmulltq_m_n_s16): Likewise.
(vqdmulltq_m_s32): Likewise.
(vqdmulltq_m_s16): Likewise.
(vqrshrnbq_m_n_s32): Likewise.
(vqrshrnbq_m_n_s16): Likewise.
(vqrshrnbq_m_n_u32): Likewise.
(vqrshrnbq_m_n_u16): Likewise.
(vqrshrntq_m_n_s32): Likewise.
(vqrshrntq_m_n_s16): Likewise.
(vqrshrntq_m_n_u32): Likewise.
(vqrshrntq_m_n_u16): Likewise.
(vqrshrunbq_m_n_s32): Likewise.
(vqrshrunbq_m_n_s16): Likewise.
(vqrshruntq_m_n_s32): Likewise.
(vqrshruntq_m_n_s16): Likewise.
(vqshrnbq_m_n_s32): Likewise.
(vqshrnbq_m_n_s16): Likewise.
(vqshrnbq_m_n_u32): Likewise.
(vqshrnbq_m_n_u16): Likewise.
(vqshrntq_m_n_s32): Likewise.
(vqshrntq_m_n_s16): Likewise.
(vqshrntq_m_n_u32): Likewise.
(vqshrntq_m_n_u16): Likewise.
(vqshrunbq_m_n_s32): Likewise.
(vqshrunbq_m_n_s16): Likewise.
(vqshruntq_m_n_s32): Likewise.
(vqshruntq_m_n_s16): Likewise.
(vrmlaldavhaq_p_s32): Likewise.
(vrmlaldavhaq_p_u32): Likewise.
(vrmlaldavhaxq_p_s32): Likewise.
(vrmlsldavhaq_p_s32): Likewise.
(vrmlsldavhaxq_p_s32): Likewise.
(vrshrnbq_m_n_s32): Likewise.
(vrshrnbq_m_n_s16): Likewise.
(vrshrnbq_m_n_u32): Likewise.
(vrshrnbq_m_n_u16): Likewise.
(vrshrntq_m_n_s32): Likewise.
(vrshrntq_m_n_s16): Likewise.
(vrshrntq_m_n_u32): Likewise.
(vrshrntq_m_n_u16): Likewise.
(vshllbq_m_n_s8): Likewise.
(vshllbq_m_n_s16): Likewise.
(vshllbq_m_n_u8): Likewise.
(vshllbq_m_n_u16): Likewise.
(vshlltq_m_n_s8): Likewise.
(vshlltq_m_n_s16): Likewise.
(vshlltq_m_n_u8): Likewise.
(vshlltq_m_n_u16): Likewise.
(vshrnbq_m_n_s32): Likewise.
(vshrnbq_m_n_s16): Likewise.
(vshrnbq_m_n_u32): Likewise.
(vshrnbq_m_n_u16): Likewise.
(vshrntq_m_n_s32): Likewise.
(vshrntq_m_n_s16): Likewise.
(vshrntq_m_n_u32): Likewise.
(vshrntq_m_n_u16): Likewise.
(__arm_vmlaldavaq_p_s32): Define intrinsic.
(__arm_vmlaldavaq_p_s16): Likewise.
(__arm_vmlaldavaq_p_u32): Likewise.
(__arm_vmlaldavaq_p_u16): Likewise.
(__arm_vmlaldavaxq_p_s32): Likewise.
(__arm_vmlaldavaxq_p_s16): Likewise.
(__arm_vmlaldavaxq_p_u32): Likewise.
(__arm_vmlaldavaxq_p_u16): Likewise.
(__arm_vmlsldavaq_p_s32): Likewise.
(__arm_vmlsldavaq_p_s16): Likewise.
(__arm_vmlsldavaxq_p_s32): Likewise.
(__arm_vmlsldavaxq_p_s16): Likewise.
(__arm_vmullbq_poly_m_p8): Likewise.
(__arm_vmullbq_poly_m_p16): Likewise.
(__arm_vmulltq_poly_m_p8): Likewise.
(__arm_vmulltq_poly_m_p16): Likewise.
(__arm_vqdmullbq_m_n_s32): Likewise.
(__arm_vqdmullbq_m_n_s16): Likewise.
(__arm_vqdmullbq_m_s32): Likewise.
(__arm_vqdmullbq_m_s16): Likewise.
(__arm_vqdmulltq_m_n_s32): Likewise.
(__arm_vqdmulltq_m_n_s16): Likewise.
(__arm_vqdmulltq_m_s32): Likewise.
(__arm_vqdmulltq_m_s16): Likewise.
(__arm_vqrshrnbq_m_n_s32): Likewise.
(__arm_vqrshrnbq_m_n_s16): Likewise.
(__arm_vqrshrnbq_m_n_u32): Likewise.
(__arm_vqrshrnbq_m_n_u16): Likewise.
(__arm_vqrshrntq_m_n_s32): Likewise.
(__arm_vqrshrntq_m_n_s16): Likewise.
(__arm_vqrshrntq_m_n_u32): Likewise.
(__arm_vqrshrntq_m_n_u16): Likewise.
(__arm_vqrshrunbq_m_n_s32): Likewise.
(__arm_vqrshrunbq_m_n_s16): Likewise.
(__arm_vqrshruntq_m_n_s32): Likewise.
(__arm_vqrshruntq_m_n_s16): Likewise.
(__arm_vqshrnbq_m_n_s32): Likewise.
(__arm_vqshrnbq_m_n_s16): Likewise.
(__arm_vqshrnbq_m_n_u32): Likewise.
(__arm_vqshrnbq_m_n_u16): Likewise.
(__arm_vqshrntq_m_n_s32): Likewise.
(__arm_vqshrntq_m_n_s16): Likewise.
(__arm_vqshrntq_m_n_u32): Likewise.
(__arm_vqshrntq_m_n_u16): Likewise.
(__arm_vqshrunbq_m_n_s32): Likewise.
(__arm_vqshrunbq_m_n_s16): Likewise.
(__arm_vqshruntq_m_n_s32): Likewise.
(__arm_vqshruntq_m_n_s16): Likewise.
(__arm_vrmlaldavhaq_p_s32): Likewise.
(__arm_vrmlaldavhaq_p_u32): Likewise.
(__arm_vrmlaldavhaxq_p_s32): Likewise.
(__arm_vrmlsldavhaq_p_s32): Likewise.
(__arm_vrmlsldavhaxq_p_s32): Likewise.
(__arm_vrshrnbq_m_n_s32): Likewise.
(__arm_vrshrnbq_m_n_s16): Likewise.
(__arm_vrshrnbq_m_n_u32): Likewise.
(__arm_vrshrnbq_m_n_u16): Likewise.
(__arm_vrshrntq_m_n_s32): Likewise.
(__arm_vrshrntq_m_n_s16): Likewise.
(__arm_vrshrntq_m_n_u32): Likewise.
(__arm_vrshrntq_m_n_u16): Likewise.
(__arm_vshllbq_m_n_s8): Likewise.
(__arm_vshllbq_m_n_s16): Likewise.
(__arm_vshllbq_m_n_u8): Likewise.
(__arm_vshllbq_m_n_u16): Likewise.
(__arm_vshlltq_m_n_s8): Likewise.
(__arm_vshlltq_m_n_s16): Likewise.
(__arm_vshlltq_m_n_u8): Likewise.
(__arm_vshlltq_m_n_u16): Likewise.
(__arm_vshrnbq_m_n_s32): Likewise.
(__arm_vshrnbq_m_n_s16): Likewise.
(__arm_vshrnbq_m_n_u32): Likewise.
(__arm_vshrnbq_m_n_u16): Likewise.
(__arm_vshrntq_m_n_s32): Likewise.
(__arm_vshrntq_m_n_s16): Likewise.
(__arm_vshrntq_m_n_u32): Likewise.
(__arm_vshrntq_m_n_u16): Likewise.
(vmullbq_poly_m): Define polymorphic variant.
(vmulltq_poly_m): Likewise.
(vshllbq_m): Likewise.
(vshrntq_m_n): Likewise.
(vshrnbq_m_n): Likewise.
(vshlltq_m_n): Likewise.
(vshllbq_m_n): Likewise.
(vrshrntq_m_n): Likewise.
(vrshrnbq_m_n): Likewise.
(vqshruntq_m_n): Likewise.
(vqshrunbq_m_n): Likewise.
(vqdmullbq_m_n): Likewise.
(vqdmullbq_m): Likewise.
(vqdmulltq_m_n): Likewise.
(vqdmulltq_m): Likewise.
(vqrshrnbq_m_n): Likewise.
(vqrshrntq_m_n): Likewise.
(vqrshrunbq_m_n): Likewise.
(vqrshruntq_m_n): Likewise.
(vqshrnbq_m_n): Likewise.
(vqshrntq_m_n): Likewise.
* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
builtin qualifiers.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (VMLALDAVAQ_P): Define iterator.
(VMLALDAVAXQ_P): Likewise.
(VQRSHRNBQ_M_N): Likewise.
(VQRSHRNTQ_M_N): Likewise.
(VQSHRNBQ_M_N): Likewise.
(VQSHRNTQ_M_N): Likewise.
(VRSHRNBQ_M_N): Likewise.
(VRSHRNTQ_M_N): Likewise.
(VSHLLBQ_M_N): Likewise.
(VSHLLTQ_M_N): Likewise.
(VSHRNBQ_M_N): Likewise.
(VSHRNTQ_M_N): Likewise.
(mve_vmlaldavaq_p_<supf><mode>): Define RTL pattern.
(mve_vmlaldavaxq_p_<supf><mode>): Likewise.
(mve_vqrshrnbq_m_n_<supf><mode>): Likewise.
(mve_vqrshrntq_m_n_<supf><mode>): Likewise.
(mve_vqshrnbq_m_n_<supf><mode>): Likewise.
(mve_vqshrntq_m_n_<supf><mode>): Likewise.
(mve_vrmlaldavhaq_p_sv4si): Likewise.
(mve_vrshrnbq_m_n_<supf><mode>): Likewise.
(mve_vrshrntq_m_n_<supf><mode>): Likewise.
(mve_vshllbq_m_n_<supf><mode>): Likewise.
(mve_vshlltq_m_n_<supf><mode>): Likewise.
(mve_vshrnbq_m_n_<supf><mode>): Likewise.
(mve_vshrntq_m_n_<supf><mode>): Likewise.
(mve_vmlsldavaq_p_s<mode>): Likewise.
(mve_vmlsldavaxq_p_s<mode>): Likewise.
(mve_vmullbq_poly_m_p<mode>): Likewise.
(mve_vmulltq_poly_m_p<mode>): Likewise.
(mve_vqdmullbq_m_n_s<mode>): Likewise.
(mve_vqdmullbq_m_s<mode>): Likewise.
(mve_vqdmulltq_m_n_s<mode>): Likewise.
(mve_vqdmulltq_m_s<mode>): Likewise.
(mve_vqrshrunbq_m_n_s<mode>): Likewise.
(mve_vqrshruntq_m_n_s<mode>): Likewise.
(mve_vqshrunbq_m_n_s<mode>): Likewise.
(mve_vqshruntq_m_n_s<mode>): Likewise.
(mve_vrmlaldavhaq_p_uv4si): Likewise.
(mve_vrmlaldavhaxq_p_sv4si): Likewise.
(mve_vrmlsldavhaq_p_sv4si): Likewise.
(mve_vrmlsldavhaxq_p_sv4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_m_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_poly_m_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_m_p16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_poly_m_p8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmullbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulltq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshllbq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlltq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_m_n_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:58:10 +0000 (16:58 +0000)]
[ARM][GCC][2/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vabdq_m_s8, vabdq_m_s32, vabdq_m_s16, vabdq_m_u8, vabdq_m_u32, vabdq_m_u16, vaddq_m_n_s8, vaddq_m_n_s32, vaddq_m_n_s16, vaddq_m_n_u8, vaddq_m_n_u32, vaddq_m_n_u16, vaddq_m_s8, vaddq_m_s32, vaddq_m_s16, vaddq_m_u8, vaddq_m_u32, vaddq_m_u16, vandq_m_s8, vandq_m_s32, vandq_m_s16, vandq_m_u8, vandq_m_u32, vandq_m_u16, vbicq_m_s8, vbicq_m_s32, vbicq_m_s16, vbicq_m_u8, vbicq_m_u32, vbicq_m_u16, vbrsrq_m_n_s8, vbrsrq_m_n_s32, vbrsrq_m_n_s16, vbrsrq_m_n_u8, vbrsrq_m_n_u32, vbrsrq_m_n_u16, vcaddq_rot270_m_s8, vcaddq_rot270_m_s32, vcaddq_rot270_m_s16, vcaddq_rot270_m_u8, vcaddq_rot270_m_u32, vcaddq_rot270_m_u16, vcaddq_rot90_m_s8, vcaddq_rot90_m_s32, vcaddq_rot90_m_s16, vcaddq_rot90_m_u8, vcaddq_rot90_m_u32, vcaddq_rot90_m_u16, veorq_m_s8, veorq_m_s32, veorq_m_s16, veorq_m_u8, veorq_m_u32, veorq_m_u16, vhaddq_m_n_s8, vhaddq_m_n_s32, vhaddq_m_n_s16, vhaddq_m_n_u8, vhaddq_m_n_u32, vhaddq_m_n_u16, vhaddq_m_s8, vhaddq_m_s32, vhaddq_m_s16, vhaddq_m_u8, vhaddq_m_u32, vhaddq_m_u16, vhcaddq_rot270_m_s8, vhcaddq_rot270_m_s32, vhcaddq_rot270_m_s16, vhcaddq_rot90_m_s8, vhcaddq_rot90_m_s32, vhcaddq_rot90_m_s16, vhsubq_m_n_s8, vhsubq_m_n_s32, vhsubq_m_n_s16, vhsubq_m_n_u8, vhsubq_m_n_u32, vhsubq_m_n_u16, vhsubq_m_s8, vhsubq_m_s32, vhsubq_m_s16, vhsubq_m_u8, vhsubq_m_u32, vhsubq_m_u16, vmaxq_m_s8, vmaxq_m_s32, vmaxq_m_s16, vmaxq_m_u8, vmaxq_m_u32, vmaxq_m_u16, vminq_m_s8, vminq_m_s32, vminq_m_s16, vminq_m_u8, vminq_m_u32, vminq_m_u16, vmladavaq_p_s8, vmladavaq_p_s32, vmladavaq_p_s16, vmladavaq_p_u8, vmladavaq_p_u32, vmladavaq_p_u16, vmladavaxq_p_s8, vmladavaxq_p_s32, vmladavaxq_p_s16, vmlaq_m_n_s8, vmlaq_m_n_s32, vmlaq_m_n_s16, vmlaq_m_n_u8, vmlaq_m_n_u32, vmlaq_m_n_u16, vmlasq_m_n_s8, vmlasq_m_n_s32, vmlasq_m_n_s16, vmlasq_m_n_u8, vmlasq_m_n_u32, vmlasq_m_n_u16, vmlsdavaq_p_s8, vmlsdavaq_p_s32, vmlsdavaq_p_s16, vmlsdavaxq_p_s8, vmlsdavaxq_p_s32, vmlsdavaxq_p_s16, vmulhq_m_s8, vmulhq_m_s32, vmulhq_m_s16, vmulhq_m_u8, vmulhq_m_u32, vmulhq_m_u16, vmullbq_int_m_s8, vmullbq_int_m_s32, vmullbq_int_m_s16, vmullbq_int_m_u8, vmullbq_int_m_u32, vmullbq_int_m_u16, vmulltq_int_m_s8, vmulltq_int_m_s32, vmulltq_int_m_s16, vmulltq_int_m_u8, vmulltq_int_m_u32, vmulltq_int_m_u16, vmulq_m_n_s8, vmulq_m_n_s32, vmulq_m_n_s16, vmulq_m_n_u8, vmulq_m_n_u32, vmulq_m_n_u16, vmulq_m_s8, vmulq_m_s32, vmulq_m_s16, vmulq_m_u8, vmulq_m_u32, vmulq_m_u16, vornq_m_s8, vornq_m_s32, vornq_m_s16, vornq_m_u8, vornq_m_u32, vornq_m_u16, vorrq_m_s8, vorrq_m_s32, vorrq_m_s16, vorrq_m_u8, vorrq_m_u32, vorrq_m_u16, vqaddq_m_n_s8, vqaddq_m_n_s32, vqaddq_m_n_s16, vqaddq_m_n_u8, vqaddq_m_n_u32, vqaddq_m_n_u16, vqaddq_m_s8, vqaddq_m_s32, vqaddq_m_s16, vqaddq_m_u8, vqaddq_m_u32, vqaddq_m_u16, vqdmladhq_m_s8, vqdmladhq_m_s32, vqdmladhq_m_s16, vqdmladhxq_m_s8, vqdmladhxq_m_s32, vqdmladhxq_m_s16, vqdmlahq_m_n_s8, vqdmlahq_m_n_s32, vqdmlahq_m_n_s16, vqdmlahq_m_n_u8, vqdmlahq_m_n_u32, vqdmlahq_m_n_u16, vqdmlsdhq_m_s8, vqdmlsdhq_m_s32, vqdmlsdhq_m_s16, vqdmlsdhxq_m_s8, vqdmlsdhxq_m_s32, vqdmlsdhxq_m_s16, vqdmulhq_m_n_s8, vqdmulhq_m_n_s32, vqdmulhq_m_n_s16, vqdmulhq_m_s8, vqdmulhq_m_s32, vqdmulhq_m_s16, vqrdmladhq_m_s8, vqrdmladhq_m_s32, vqrdmladhq_m_s16, vqrdmladhxq_m_s8, vqrdmladhxq_m_s32, vqrdmladhxq_m_s16, vqrdmlahq_m_n_s8, vqrdmlahq_m_n_s32, vqrdmlahq_m_n_s16, vqrdmlahq_m_n_u8, vqrdmlahq_m_n_u32, vqrdmlahq_m_n_u16, vqrdmlashq_m_n_s8, vqrdmlashq_m_n_s32, vqrdmlashq_m_n_s16, vqrdmlashq_m_n_u8, vqrdmlashq_m_n_u32, vqrdmlashq_m_n_u16, vqrdmlsdhq_m_s8, vqrdmlsdhq_m_s32, vqrdmlsdhq_m_s16, vqrdmlsdhxq_m_s8, vqrdmlsdhxq_m_s32, vqrdmlsdhxq_m_s16, vqrdmulhq_m_n_s8, 
vqrdmulhq_m_n_s32, vqrdmulhq_m_n_s16, vqrdmulhq_m_s8, vqrdmulhq_m_s32, vqrdmulhq_m_s16, vqrshlq_m_s8, vqrshlq_m_s32, vqrshlq_m_s16, vqrshlq_m_u8, vqrshlq_m_u32, vqrshlq_m_u16, vqshlq_m_n_s8, vqshlq_m_n_s32, vqshlq_m_n_s16, vqshlq_m_n_u8, vqshlq_m_n_u32, vqshlq_m_n_u16, vqshlq_m_s8, vqshlq_m_s32, vqshlq_m_s16, vqshlq_m_u8, vqshlq_m_u32, vqshlq_m_u16, vqsubq_m_n_s8, vqsubq_m_n_s32, vqsubq_m_n_s16, vqsubq_m_n_u8, vqsubq_m_n_u32, vqsubq_m_n_u16, vqsubq_m_s8, vqsubq_m_s32, vqsubq_m_s16, vqsubq_m_u8, vqsubq_m_u32, vqsubq_m_u16, vrhaddq_m_s8, vrhaddq_m_s32, vrhaddq_m_s16, vrhaddq_m_u8, vrhaddq_m_u32, vrhaddq_m_u16, vrmulhq_m_s8, vrmulhq_m_s32, vrmulhq_m_s16, vrmulhq_m_u8, vrmulhq_m_u32, vrmulhq_m_u16, vrshlq_m_s8, vrshlq_m_s32, vrshlq_m_s16, vrshlq_m_u8, vrshlq_m_u32, vrshlq_m_u16, vrshrq_m_n_s8, vrshrq_m_n_s32, vrshrq_m_n_s16, vrshrq_m_n_u8, vrshrq_m_n_u32, vrshrq_m_n_u16, vshlq_m_n_s8, vshlq_m_n_s32, vshlq_m_n_s16, vshlq_m_n_u8, vshlq_m_n_u32, vshlq_m_n_u16, vshrq_m_n_s8, vshrq_m_n_s32, vshrq_m_n_s16, vshrq_m_n_u8, vshrq_m_n_u32, vshrq_m_n_u16, vsliq_m_n_s8, vsliq_m_n_s32, vsliq_m_n_s16, vsliq_m_n_u8, vsliq_m_n_u32, vsliq_m_n_u16, vsubq_m_n_s8, vsubq_m_n_s32, vsubq_m_n_s16, vsubq_m_n_u8, vsubq_m_n_u32, vsubq_m_n_u16.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
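For illustration (not part of the patch): a sketch of the integer merging-predicated forms, under the same assumptions as the earlier sketches (<arm_mve.h>, an MVE-enabled command line, vctp8q from another patch in the series); signatures follow the MVE ACLE specification linked above.
#include <arm_mve.h>
/* Merging predicated maximum: inactive lanes of the result keep the value
   of 'fallback'.  */
int8x16_t max_masked (int8x16_t fallback, int8x16_t a, int8x16_t b,
                      mve_pred16_t p)
{
  return vmaxq_m_s8 (fallback, a, b, p);
}
/* Predicated multiply-accumulate reduction into a 32-bit scalar; only the
   byte lanes enabled by vctp8q (n) contribute to the sum.  */
int32_t mladava_tail (int32_t acc, int8x16_t a, int8x16_t b, unsigned n)
{
  return vmladavaq_p_s8 (acc, a, b, vctp8q (n));
}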
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vabdq_m_s8): Define macro.
(vabdq_m_s32): Likewise.
(vabdq_m_s16): Likewise.
(vabdq_m_u8): Likewise.
(vabdq_m_u32): Likewise.
(vabdq_m_u16): Likewise.
(vaddq_m_n_s8): Likewise.
(vaddq_m_n_s32): Likewise.
(vaddq_m_n_s16): Likewise.
(vaddq_m_n_u8): Likewise.
(vaddq_m_n_u32): Likewise.
(vaddq_m_n_u16): Likewise.
(vaddq_m_s8): Likewise.
(vaddq_m_s32): Likewise.
(vaddq_m_s16): Likewise.
(vaddq_m_u8): Likewise.
(vaddq_m_u32): Likewise.
(vaddq_m_u16): Likewise.
(vandq_m_s8): Likewise.
(vandq_m_s32): Likewise.
(vandq_m_s16): Likewise.
(vandq_m_u8): Likewise.
(vandq_m_u32): Likewise.
(vandq_m_u16): Likewise.
(vbicq_m_s8): Likewise.
(vbicq_m_s32): Likewise.
(vbicq_m_s16): Likewise.
(vbicq_m_u8): Likewise.
(vbicq_m_u32): Likewise.
(vbicq_m_u16): Likewise.
(vbrsrq_m_n_s8): Likewise.
(vbrsrq_m_n_s32): Likewise.
(vbrsrq_m_n_s16): Likewise.
(vbrsrq_m_n_u8): Likewise.
(vbrsrq_m_n_u32): Likewise.
(vbrsrq_m_n_u16): Likewise.
(vcaddq_rot270_m_s8): Likewise.
(vcaddq_rot270_m_s32): Likewise.
(vcaddq_rot270_m_s16): Likewise.
(vcaddq_rot270_m_u8): Likewise.
(vcaddq_rot270_m_u32): Likewise.
(vcaddq_rot270_m_u16): Likewise.
(vcaddq_rot90_m_s8): Likewise.
(vcaddq_rot90_m_s32): Likewise.
(vcaddq_rot90_m_s16): Likewise.
(vcaddq_rot90_m_u8): Likewise.
(vcaddq_rot90_m_u32): Likewise.
(vcaddq_rot90_m_u16): Likewise.
(veorq_m_s8): Likewise.
(veorq_m_s32): Likewise.
(veorq_m_s16): Likewise.
(veorq_m_u8): Likewise.
(veorq_m_u32): Likewise.
(veorq_m_u16): Likewise.
(vhaddq_m_n_s8): Likewise.
(vhaddq_m_n_s32): Likewise.
(vhaddq_m_n_s16): Likewise.
(vhaddq_m_n_u8): Likewise.
(vhaddq_m_n_u32): Likewise.
(vhaddq_m_n_u16): Likewise.
(vhaddq_m_s8): Likewise.
(vhaddq_m_s32): Likewise.
(vhaddq_m_s16): Likewise.
(vhaddq_m_u8): Likewise.
(vhaddq_m_u32): Likewise.
(vhaddq_m_u16): Likewise.
(vhcaddq_rot270_m_s8): Likewise.
(vhcaddq_rot270_m_s32): Likewise.
(vhcaddq_rot270_m_s16): Likewise.
(vhcaddq_rot90_m_s8): Likewise.
(vhcaddq_rot90_m_s32): Likewise.
(vhcaddq_rot90_m_s16): Likewise.
(vhsubq_m_n_s8): Likewise.
(vhsubq_m_n_s32): Likewise.
(vhsubq_m_n_s16): Likewise.
(vhsubq_m_n_u8): Likewise.
(vhsubq_m_n_u32): Likewise.
(vhsubq_m_n_u16): Likewise.
(vhsubq_m_s8): Likewise.
(vhsubq_m_s32): Likewise.
(vhsubq_m_s16): Likewise.
(vhsubq_m_u8): Likewise.
(vhsubq_m_u32): Likewise.
(vhsubq_m_u16): Likewise.
(vmaxq_m_s8): Likewise.
(vmaxq_m_s32): Likewise.
(vmaxq_m_s16): Likewise.
(vmaxq_m_u8): Likewise.
(vmaxq_m_u32): Likewise.
(vmaxq_m_u16): Likewise.
(vminq_m_s8): Likewise.
(vminq_m_s32): Likewise.
(vminq_m_s16): Likewise.
(vminq_m_u8): Likewise.
(vminq_m_u32): Likewise.
(vminq_m_u16): Likewise.
(vmladavaq_p_s8): Likewise.
(vmladavaq_p_s32): Likewise.
(vmladavaq_p_s16): Likewise.
(vmladavaq_p_u8): Likewise.
(vmladavaq_p_u32): Likewise.
(vmladavaq_p_u16): Likewise.
(vmladavaxq_p_s8): Likewise.
(vmladavaxq_p_s32): Likewise.
(vmladavaxq_p_s16): Likewise.
(vmlaq_m_n_s8): Likewise.
(vmlaq_m_n_s32): Likewise.
(vmlaq_m_n_s16): Likewise.
(vmlaq_m_n_u8): Likewise.
(vmlaq_m_n_u32): Likewise.
(vmlaq_m_n_u16): Likewise.
(vmlasq_m_n_s8): Likewise.
(vmlasq_m_n_s32): Likewise.
(vmlasq_m_n_s16): Likewise.
(vmlasq_m_n_u8): Likewise.
(vmlasq_m_n_u32): Likewise.
(vmlasq_m_n_u16): Likewise.
(vmlsdavaq_p_s8): Likewise.
(vmlsdavaq_p_s32): Likewise.
(vmlsdavaq_p_s16): Likewise.
(vmlsdavaxq_p_s8): Likewise.
(vmlsdavaxq_p_s32): Likewise.
(vmlsdavaxq_p_s16): Likewise.
(vmulhq_m_s8): Likewise.
(vmulhq_m_s32): Likewise.
(vmulhq_m_s16): Likewise.
(vmulhq_m_u8): Likewise.
(vmulhq_m_u32): Likewise.
(vmulhq_m_u16): Likewise.
(vmullbq_int_m_s8): Likewise.
(vmullbq_int_m_s32): Likewise.
(vmullbq_int_m_s16): Likewise.
(vmullbq_int_m_u8): Likewise.
(vmullbq_int_m_u32): Likewise.
(vmullbq_int_m_u16): Likewise.
(vmulltq_int_m_s8): Likewise.
(vmulltq_int_m_s32): Likewise.
(vmulltq_int_m_s16): Likewise.
(vmulltq_int_m_u8): Likewise.
(vmulltq_int_m_u32): Likewise.
(vmulltq_int_m_u16): Likewise.
(vmulq_m_n_s8): Likewise.
(vmulq_m_n_s32): Likewise.
(vmulq_m_n_s16): Likewise.
(vmulq_m_n_u8): Likewise.
(vmulq_m_n_u32): Likewise.
(vmulq_m_n_u16): Likewise.
(vmulq_m_s8): Likewise.
(vmulq_m_s32): Likewise.
(vmulq_m_s16): Likewise.
(vmulq_m_u8): Likewise.
(vmulq_m_u32): Likewise.
(vmulq_m_u16): Likewise.
(vornq_m_s8): Likewise.
(vornq_m_s32): Likewise.
(vornq_m_s16): Likewise.
(vornq_m_u8): Likewise.
(vornq_m_u32): Likewise.
(vornq_m_u16): Likewise.
(vorrq_m_s8): Likewise.
(vorrq_m_s32): Likewise.
(vorrq_m_s16): Likewise.
(vorrq_m_u8): Likewise.
(vorrq_m_u32): Likewise.
(vorrq_m_u16): Likewise.
(vqaddq_m_n_s8): Likewise.
(vqaddq_m_n_s32): Likewise.
(vqaddq_m_n_s16): Likewise.
(vqaddq_m_n_u8): Likewise.
(vqaddq_m_n_u32): Likewise.
(vqaddq_m_n_u16): Likewise.
(vqaddq_m_s8): Likewise.
(vqaddq_m_s32): Likewise.
(vqaddq_m_s16): Likewise.
(vqaddq_m_u8): Likewise.
(vqaddq_m_u32): Likewise.
(vqaddq_m_u16): Likewise.
(vqdmladhq_m_s8): Likewise.
(vqdmladhq_m_s32): Likewise.
(vqdmladhq_m_s16): Likewise.
(vqdmladhxq_m_s8): Likewise.
(vqdmladhxq_m_s32): Likewise.
(vqdmladhxq_m_s16): Likewise.
(vqdmlahq_m_n_s8): Likewise.
(vqdmlahq_m_n_s32): Likewise.
(vqdmlahq_m_n_s16): Likewise.
(vqdmlahq_m_n_u8): Likewise.
(vqdmlahq_m_n_u32): Likewise.
(vqdmlahq_m_n_u16): Likewise.
(vqdmlsdhq_m_s8): Likewise.
(vqdmlsdhq_m_s32): Likewise.
(vqdmlsdhq_m_s16): Likewise.
(vqdmlsdhxq_m_s8): Likewise.
(vqdmlsdhxq_m_s32): Likewise.
(vqdmlsdhxq_m_s16): Likewise.
(vqdmulhq_m_n_s8): Likewise.
(vqdmulhq_m_n_s32): Likewise.
(vqdmulhq_m_n_s16): Likewise.
(vqdmulhq_m_s8): Likewise.
(vqdmulhq_m_s32): Likewise.
(vqdmulhq_m_s16): Likewise.
(vqrdmladhq_m_s8): Likewise.
(vqrdmladhq_m_s32): Likewise.
(vqrdmladhq_m_s16): Likewise.
(vqrdmladhxq_m_s8): Likewise.
(vqrdmladhxq_m_s32): Likewise.
(vqrdmladhxq_m_s16): Likewise.
(vqrdmlahq_m_n_s8): Likewise.
(vqrdmlahq_m_n_s32): Likewise.
(vqrdmlahq_m_n_s16): Likewise.
(vqrdmlahq_m_n_u8): Likewise.
(vqrdmlahq_m_n_u32): Likewise.
(vqrdmlahq_m_n_u16): Likewise.
(vqrdmlashq_m_n_s8): Likewise.
(vqrdmlashq_m_n_s32): Likewise.
(vqrdmlashq_m_n_s16): Likewise.
(vqrdmlashq_m_n_u8): Likewise.
(vqrdmlashq_m_n_u32): Likewise.
(vqrdmlashq_m_n_u16): Likewise.
(vqrdmlsdhq_m_s8): Likewise.
(vqrdmlsdhq_m_s32): Likewise.
(vqrdmlsdhq_m_s16): Likewise.
(vqrdmlsdhxq_m_s8): Likewise.
(vqrdmlsdhxq_m_s32): Likewise.
(vqrdmlsdhxq_m_s16): Likewise.
(vqrdmulhq_m_n_s8): Likewise.
(vqrdmulhq_m_n_s32): Likewise.
(vqrdmulhq_m_n_s16): Likewise.
(vqrdmulhq_m_s8): Likewise.
(vqrdmulhq_m_s32): Likewise.
(vqrdmulhq_m_s16): Likewise.
(vqrshlq_m_s8): Likewise.
(vqrshlq_m_s32): Likewise.
(vqrshlq_m_s16): Likewise.
(vqrshlq_m_u8): Likewise.
(vqrshlq_m_u32): Likewise.
(vqrshlq_m_u16): Likewise.
(vqshlq_m_n_s8): Likewise.
(vqshlq_m_n_s32): Likewise.
(vqshlq_m_n_s16): Likewise.
(vqshlq_m_n_u8): Likewise.
(vqshlq_m_n_u32): Likewise.
(vqshlq_m_n_u16): Likewise.
(vqshlq_m_s8): Likewise.
(vqshlq_m_s32): Likewise.
(vqshlq_m_s16): Likewise.
(vqshlq_m_u8): Likewise.
(vqshlq_m_u32): Likewise.
(vqshlq_m_u16): Likewise.
(vqsubq_m_n_s8): Likewise.
(vqsubq_m_n_s32): Likewise.
(vqsubq_m_n_s16): Likewise.
(vqsubq_m_n_u8): Likewise.
(vqsubq_m_n_u32): Likewise.
(vqsubq_m_n_u16): Likewise.
(vqsubq_m_s8): Likewise.
(vqsubq_m_s32): Likewise.
(vqsubq_m_s16): Likewise.
(vqsubq_m_u8): Likewise.
(vqsubq_m_u32): Likewise.
(vqsubq_m_u16): Likewise.
(vrhaddq_m_s8): Likewise.
(vrhaddq_m_s32): Likewise.
(vrhaddq_m_s16): Likewise.
(vrhaddq_m_u8): Likewise.
(vrhaddq_m_u32): Likewise.
(vrhaddq_m_u16): Likewise.
(vrmulhq_m_s8): Likewise.
(vrmulhq_m_s32): Likewise.
(vrmulhq_m_s16): Likewise.
(vrmulhq_m_u8): Likewise.
(vrmulhq_m_u32): Likewise.
(vrmulhq_m_u16): Likewise.
(vrshlq_m_s8): Likewise.
(vrshlq_m_s32): Likewise.
(vrshlq_m_s16): Likewise.
(vrshlq_m_u8): Likewise.
(vrshlq_m_u32): Likewise.
(vrshlq_m_u16): Likewise.
(vrshrq_m_n_s8): Likewise.
(vrshrq_m_n_s32): Likewise.
(vrshrq_m_n_s16): Likewise.
(vrshrq_m_n_u8): Likewise.
(vrshrq_m_n_u32): Likewise.
(vrshrq_m_n_u16): Likewise.
(vshlq_m_n_s8): Likewise.
(vshlq_m_n_s32): Likewise.
(vshlq_m_n_s16): Likewise.
(vshlq_m_n_u8): Likewise.
(vshlq_m_n_u32): Likewise.
(vshlq_m_n_u16): Likewise.
(vshrq_m_n_s8): Likewise.
(vshrq_m_n_s32): Likewise.
(vshrq_m_n_s16): Likewise.
(vshrq_m_n_u8): Likewise.
(vshrq_m_n_u32): Likewise.
(vshrq_m_n_u16): Likewise.
(vsliq_m_n_s8): Likewise.
(vsliq_m_n_s32): Likewise.
(vsliq_m_n_s16): Likewise.
(vsliq_m_n_u8): Likewise.
(vsliq_m_n_u32): Likewise.
(vsliq_m_n_u16): Likewise.
(vsubq_m_n_s8): Likewise.
(vsubq_m_n_s32): Likewise.
(vsubq_m_n_s16): Likewise.
(vsubq_m_n_u8): Likewise.
(vsubq_m_n_u32): Likewise.
(vsubq_m_n_u16): Likewise.
(__arm_vabdq_m_s8): Define intrinsic.
(__arm_vabdq_m_s32): Likewise.
(__arm_vabdq_m_s16): Likewise.
(__arm_vabdq_m_u8): Likewise.
(__arm_vabdq_m_u32): Likewise.
(__arm_vabdq_m_u16): Likewise.
(__arm_vaddq_m_n_s8): Likewise.
(__arm_vaddq_m_n_s32): Likewise.
(__arm_vaddq_m_n_s16): Likewise.
(__arm_vaddq_m_n_u8): Likewise.
(__arm_vaddq_m_n_u32): Likewise.
(__arm_vaddq_m_n_u16): Likewise.
(__arm_vaddq_m_s8): Likewise.
(__arm_vaddq_m_s32): Likewise.
(__arm_vaddq_m_s16): Likewise.
(__arm_vaddq_m_u8): Likewise.
(__arm_vaddq_m_u32): Likewise.
(__arm_vaddq_m_u16): Likewise.
(__arm_vandq_m_s8): Likewise.
(__arm_vandq_m_s32): Likewise.
(__arm_vandq_m_s16): Likewise.
(__arm_vandq_m_u8): Likewise.
(__arm_vandq_m_u32): Likewise.
(__arm_vandq_m_u16): Likewise.
(__arm_vbicq_m_s8): Likewise.
(__arm_vbicq_m_s32): Likewise.
(__arm_vbicq_m_s16): Likewise.
(__arm_vbicq_m_u8): Likewise.
(__arm_vbicq_m_u32): Likewise.
(__arm_vbicq_m_u16): Likewise.
(__arm_vbrsrq_m_n_s8): Likewise.
(__arm_vbrsrq_m_n_s32): Likewise.
(__arm_vbrsrq_m_n_s16): Likewise.
(__arm_vbrsrq_m_n_u8): Likewise.
(__arm_vbrsrq_m_n_u32): Likewise.
(__arm_vbrsrq_m_n_u16): Likewise.
(__arm_vcaddq_rot270_m_s8): Likewise.
(__arm_vcaddq_rot270_m_s32): Likewise.
(__arm_vcaddq_rot270_m_s16): Likewise.
(__arm_vcaddq_rot270_m_u8): Likewise.
(__arm_vcaddq_rot270_m_u32): Likewise.
(__arm_vcaddq_rot270_m_u16): Likewise.
(__arm_vcaddq_rot90_m_s8): Likewise.
(__arm_vcaddq_rot90_m_s32): Likewise.
(__arm_vcaddq_rot90_m_s16): Likewise.
(__arm_vcaddq_rot90_m_u8): Likewise.
(__arm_vcaddq_rot90_m_u32): Likewise.
(__arm_vcaddq_rot90_m_u16): Likewise.
(__arm_veorq_m_s8): Likewise.
(__arm_veorq_m_s32): Likewise.
(__arm_veorq_m_s16): Likewise.
(__arm_veorq_m_u8): Likewise.
(__arm_veorq_m_u32): Likewise.
(__arm_veorq_m_u16): Likewise.
(__arm_vhaddq_m_n_s8): Likewise.
(__arm_vhaddq_m_n_s32): Likewise.
(__arm_vhaddq_m_n_s16): Likewise.
(__arm_vhaddq_m_n_u8): Likewise.
(__arm_vhaddq_m_n_u32): Likewise.
(__arm_vhaddq_m_n_u16): Likewise.
(__arm_vhaddq_m_s8): Likewise.
(__arm_vhaddq_m_s32): Likewise.
(__arm_vhaddq_m_s16): Likewise.
(__arm_vhaddq_m_u8): Likewise.
(__arm_vhaddq_m_u32): Likewise.
(__arm_vhaddq_m_u16): Likewise.
(__arm_vhcaddq_rot270_m_s8): Likewise.
(__arm_vhcaddq_rot270_m_s32): Likewise.
(__arm_vhcaddq_rot270_m_s16): Likewise.
(__arm_vhcaddq_rot90_m_s8): Likewise.
(__arm_vhcaddq_rot90_m_s32): Likewise.
(__arm_vhcaddq_rot90_m_s16): Likewise.
(__arm_vhsubq_m_n_s8): Likewise.
(__arm_vhsubq_m_n_s32): Likewise.
(__arm_vhsubq_m_n_s16): Likewise.
(__arm_vhsubq_m_n_u8): Likewise.
(__arm_vhsubq_m_n_u32): Likewise.
(__arm_vhsubq_m_n_u16): Likewise.
(__arm_vhsubq_m_s8): Likewise.
(__arm_vhsubq_m_s32): Likewise.
(__arm_vhsubq_m_s16): Likewise.
(__arm_vhsubq_m_u8): Likewise.
(__arm_vhsubq_m_u32): Likewise.
(__arm_vhsubq_m_u16): Likewise.
(__arm_vmaxq_m_s8): Likewise.
(__arm_vmaxq_m_s32): Likewise.
(__arm_vmaxq_m_s16): Likewise.
(__arm_vmaxq_m_u8): Likewise.
(__arm_vmaxq_m_u32): Likewise.
(__arm_vmaxq_m_u16): Likewise.
(__arm_vminq_m_s8): Likewise.
(__arm_vminq_m_s32): Likewise.
(__arm_vminq_m_s16): Likewise.
(__arm_vminq_m_u8): Likewise.
(__arm_vminq_m_u32): Likewise.
(__arm_vminq_m_u16): Likewise.
(__arm_vmladavaq_p_s8): Likewise.
(__arm_vmladavaq_p_s32): Likewise.
(__arm_vmladavaq_p_s16): Likewise.
(__arm_vmladavaq_p_u8): Likewise.
(__arm_vmladavaq_p_u32): Likewise.
(__arm_vmladavaq_p_u16): Likewise.
(__arm_vmladavaxq_p_s8): Likewise.
(__arm_vmladavaxq_p_s32): Likewise.
(__arm_vmladavaxq_p_s16): Likewise.
(__arm_vmlaq_m_n_s8): Likewise.
(__arm_vmlaq_m_n_s32): Likewise.
(__arm_vmlaq_m_n_s16): Likewise.
(__arm_vmlaq_m_n_u8): Likewise.
(__arm_vmlaq_m_n_u32): Likewise.
(__arm_vmlaq_m_n_u16): Likewise.
(__arm_vmlasq_m_n_s8): Likewise.
(__arm_vmlasq_m_n_s32): Likewise.
(__arm_vmlasq_m_n_s16): Likewise.
(__arm_vmlasq_m_n_u8): Likewise.
(__arm_vmlasq_m_n_u32): Likewise.
(__arm_vmlasq_m_n_u16): Likewise.
(__arm_vmlsdavaq_p_s8): Likewise.
(__arm_vmlsdavaq_p_s32): Likewise.
(__arm_vmlsdavaq_p_s16): Likewise.
(__arm_vmlsdavaxq_p_s8): Likewise.
(__arm_vmlsdavaxq_p_s32): Likewise.
(__arm_vmlsdavaxq_p_s16): Likewise.
(__arm_vmulhq_m_s8): Likewise.
(__arm_vmulhq_m_s32): Likewise.
(__arm_vmulhq_m_s16): Likewise.
(__arm_vmulhq_m_u8): Likewise.
(__arm_vmulhq_m_u32): Likewise.
(__arm_vmulhq_m_u16): Likewise.
(__arm_vmullbq_int_m_s8): Likewise.
(__arm_vmullbq_int_m_s32): Likewise.
(__arm_vmullbq_int_m_s16): Likewise.
(__arm_vmullbq_int_m_u8): Likewise.
(__arm_vmullbq_int_m_u32): Likewise.
(__arm_vmullbq_int_m_u16): Likewise.
(__arm_vmulltq_int_m_s8): Likewise.
(__arm_vmulltq_int_m_s32): Likewise.
(__arm_vmulltq_int_m_s16): Likewise.
(__arm_vmulltq_int_m_u8): Likewise.
(__arm_vmulltq_int_m_u32): Likewise.
(__arm_vmulltq_int_m_u16): Likewise.
(__arm_vmulq_m_n_s8): Likewise.
(__arm_vmulq_m_n_s32): Likewise.
(__arm_vmulq_m_n_s16): Likewise.
(__arm_vmulq_m_n_u8): Likewise.
(__arm_vmulq_m_n_u32): Likewise.
(__arm_vmulq_m_n_u16): Likewise.
(__arm_vmulq_m_s8): Likewise.
(__arm_vmulq_m_s32): Likewise.
(__arm_vmulq_m_s16): Likewise.
(__arm_vmulq_m_u8): Likewise.
(__arm_vmulq_m_u32): Likewise.
(__arm_vmulq_m_u16): Likewise.
(__arm_vornq_m_s8): Likewise.
(__arm_vornq_m_s32): Likewise.
(__arm_vornq_m_s16): Likewise.
(__arm_vornq_m_u8): Likewise.
(__arm_vornq_m_u32): Likewise.
(__arm_vornq_m_u16): Likewise.
(__arm_vorrq_m_s8): Likewise.
(__arm_vorrq_m_s32): Likewise.
(__arm_vorrq_m_s16): Likewise.
(__arm_vorrq_m_u8): Likewise.
(__arm_vorrq_m_u32): Likewise.
(__arm_vorrq_m_u16): Likewise.
(__arm_vqaddq_m_n_s8): Likewise.
(__arm_vqaddq_m_n_s32): Likewise.
(__arm_vqaddq_m_n_s16): Likewise.
(__arm_vqaddq_m_n_u8): Likewise.
(__arm_vqaddq_m_n_u32): Likewise.
(__arm_vqaddq_m_n_u16): Likewise.
(__arm_vqaddq_m_s8): Likewise.
(__arm_vqaddq_m_s32): Likewise.
(__arm_vqaddq_m_s16): Likewise.
(__arm_vqaddq_m_u8): Likewise.
(__arm_vqaddq_m_u32): Likewise.
(__arm_vqaddq_m_u16): Likewise.
(__arm_vqdmladhq_m_s8): Likewise.
(__arm_vqdmladhq_m_s32): Likewise.
(__arm_vqdmladhq_m_s16): Likewise.
(__arm_vqdmladhxq_m_s8): Likewise.
(__arm_vqdmladhxq_m_s32): Likewise.
(__arm_vqdmladhxq_m_s16): Likewise.
(__arm_vqdmlahq_m_n_s8): Likewise.
(__arm_vqdmlahq_m_n_s32): Likewise.
(__arm_vqdmlahq_m_n_s16): Likewise.
(__arm_vqdmlahq_m_n_u8): Likewise.
(__arm_vqdmlahq_m_n_u32): Likewise.
(__arm_vqdmlahq_m_n_u16): Likewise.
(__arm_vqdmlsdhq_m_s8): Likewise.
(__arm_vqdmlsdhq_m_s32): Likewise.
(__arm_vqdmlsdhq_m_s16): Likewise.
(__arm_vqdmlsdhxq_m_s8): Likewise.
(__arm_vqdmlsdhxq_m_s32): Likewise.
(__arm_vqdmlsdhxq_m_s16): Likewise.
(__arm_vqdmulhq_m_n_s8): Likewise.
(__arm_vqdmulhq_m_n_s32): Likewise.
(__arm_vqdmulhq_m_n_s16): Likewise.
(__arm_vqdmulhq_m_s8): Likewise.
(__arm_vqdmulhq_m_s32): Likewise.
(__arm_vqdmulhq_m_s16): Likewise.
(__arm_vqrdmladhq_m_s8): Likewise.
(__arm_vqrdmladhq_m_s32): Likewise.
(__arm_vqrdmladhq_m_s16): Likewise.
(__arm_vqrdmladhxq_m_s8): Likewise.
(__arm_vqrdmladhxq_m_s32): Likewise.
(__arm_vqrdmladhxq_m_s16): Likewise.
(__arm_vqrdmlahq_m_n_s8): Likewise.
(__arm_vqrdmlahq_m_n_s32): Likewise.
(__arm_vqrdmlahq_m_n_s16): Likewise.
(__arm_vqrdmlahq_m_n_u8): Likewise.
(__arm_vqrdmlahq_m_n_u32): Likewise.
(__arm_vqrdmlahq_m_n_u16): Likewise.
(__arm_vqrdmlashq_m_n_s8): Likewise.
(__arm_vqrdmlashq_m_n_s32): Likewise.
(__arm_vqrdmlashq_m_n_s16): Likewise.
(__arm_vqrdmlashq_m_n_u8): Likewise.
(__arm_vqrdmlashq_m_n_u32): Likewise.
(__arm_vqrdmlashq_m_n_u16): Likewise.
(__arm_vqrdmlsdhq_m_s8): Likewise.
(__arm_vqrdmlsdhq_m_s32): Likewise.
(__arm_vqrdmlsdhq_m_s16): Likewise.
(__arm_vqrdmlsdhxq_m_s8): Likewise.
(__arm_vqrdmlsdhxq_m_s32): Likewise.
(__arm_vqrdmlsdhxq_m_s16): Likewise.
(__arm_vqrdmulhq_m_n_s8): Likewise.
(__arm_vqrdmulhq_m_n_s32): Likewise.
(__arm_vqrdmulhq_m_n_s16): Likewise.
(__arm_vqrdmulhq_m_s8): Likewise.
(__arm_vqrdmulhq_m_s32): Likewise.
(__arm_vqrdmulhq_m_s16): Likewise.
(__arm_vqrshlq_m_s8): Likewise.
(__arm_vqrshlq_m_s32): Likewise.
(__arm_vqrshlq_m_s16): Likewise.
(__arm_vqrshlq_m_u8): Likewise.
(__arm_vqrshlq_m_u32): Likewise.
(__arm_vqrshlq_m_u16): Likewise.
(__arm_vqshlq_m_n_s8): Likewise.
(__arm_vqshlq_m_n_s32): Likewise.
(__arm_vqshlq_m_n_s16): Likewise.
(__arm_vqshlq_m_n_u8): Likewise.
(__arm_vqshlq_m_n_u32): Likewise.
(__arm_vqshlq_m_n_u16): Likewise.
(__arm_vqshlq_m_s8): Likewise.
(__arm_vqshlq_m_s32): Likewise.
(__arm_vqshlq_m_s16): Likewise.
(__arm_vqshlq_m_u8): Likewise.
(__arm_vqshlq_m_u32): Likewise.
(__arm_vqshlq_m_u16): Likewise.
(__arm_vqsubq_m_n_s8): Likewise.
(__arm_vqsubq_m_n_s32): Likewise.
(__arm_vqsubq_m_n_s16): Likewise.
(__arm_vqsubq_m_n_u8): Likewise.
(__arm_vqsubq_m_n_u32): Likewise.
(__arm_vqsubq_m_n_u16): Likewise.
(__arm_vqsubq_m_s8): Likewise.
(__arm_vqsubq_m_s32): Likewise.
(__arm_vqsubq_m_s16): Likewise.
(__arm_vqsubq_m_u8): Likewise.
(__arm_vqsubq_m_u32): Likewise.
(__arm_vqsubq_m_u16): Likewise.
(__arm_vrhaddq_m_s8): Likewise.
(__arm_vrhaddq_m_s32): Likewise.
(__arm_vrhaddq_m_s16): Likewise.
(__arm_vrhaddq_m_u8): Likewise.
(__arm_vrhaddq_m_u32): Likewise.
(__arm_vrhaddq_m_u16): Likewise.
(__arm_vrmulhq_m_s8): Likewise.
(__arm_vrmulhq_m_s32): Likewise.
(__arm_vrmulhq_m_s16): Likewise.
(__arm_vrmulhq_m_u8): Likewise.
(__arm_vrmulhq_m_u32): Likewise.
(__arm_vrmulhq_m_u16): Likewise.
(__arm_vrshlq_m_s8): Likewise.
(__arm_vrshlq_m_s32): Likewise.
(__arm_vrshlq_m_s16): Likewise.
(__arm_vrshlq_m_u8): Likewise.
(__arm_vrshlq_m_u32): Likewise.
(__arm_vrshlq_m_u16): Likewise.
(__arm_vrshrq_m_n_s8): Likewise.
(__arm_vrshrq_m_n_s32): Likewise.
(__arm_vrshrq_m_n_s16): Likewise.
(__arm_vrshrq_m_n_u8): Likewise.
(__arm_vrshrq_m_n_u32): Likewise.
(__arm_vrshrq_m_n_u16): Likewise.
(__arm_vshlq_m_n_s8): Likewise.
(__arm_vshlq_m_n_s32): Likewise.
(__arm_vshlq_m_n_s16): Likewise.
(__arm_vshlq_m_n_u8): Likewise.
(__arm_vshlq_m_n_u32): Likewise.
(__arm_vshlq_m_n_u16): Likewise.
(__arm_vshrq_m_n_s8): Likewise.
(__arm_vshrq_m_n_s32): Likewise.
(__arm_vshrq_m_n_s16): Likewise.
(__arm_vshrq_m_n_u8): Likewise.
(__arm_vshrq_m_n_u32): Likewise.
(__arm_vshrq_m_n_u16): Likewise.
(__arm_vsliq_m_n_s8): Likewise.
(__arm_vsliq_m_n_s32): Likewise.
(__arm_vsliq_m_n_s16): Likewise.
(__arm_vsliq_m_n_u8): Likewise.
(__arm_vsliq_m_n_u32): Likewise.
(__arm_vsliq_m_n_u16): Likewise.
(__arm_vsubq_m_n_s8): Likewise.
(__arm_vsubq_m_n_s32): Likewise.
(__arm_vsubq_m_n_s16): Likewise.
(__arm_vsubq_m_n_u8): Likewise.
(__arm_vsubq_m_n_u32): Likewise.
(__arm_vsubq_m_n_u16): Likewise.
(vqdmladhq_m): Define polymorphic variant.
(vqdmladhxq_m): Likewise.
(vqdmlsdhq_m): Likewise.
(vqdmlsdhxq_m): Likewise.
(vabdq_m): Likewise.
(vandq_m): Likewise.
(vbicq_m): Likewise.
(vbrsrq_m_n): Likewise.
(vcaddq_rot270_m): Likewise.
(vcaddq_rot90_m): Likewise.
(veorq_m): Likewise.
(vmaxq_m): Likewise.
(vminq_m): Likewise.
(vmladavaq_p): Likewise.
(vmlaq_m_n): Likewise.
(vmlasq_m_n): Likewise.
(vmulhq_m): Likewise.
(vmullbq_int_m): Likewise.
(vmulltq_int_m): Likewise.
(vornq_m): Likewise.
(vorrq_m): Likewise.
(vqdmlahq_m_n): Likewise.
(vqrdmlahq_m_n): Likewise.
(vqrdmlashq_m_n): Likewise.
(vqrshlq_m): Likewise.
(vqshlq_m_n): Likewise.
(vqshlq_m): Likewise.
(vrhaddq_m): Likewise.
(vrmulhq_m): Likewise.
(vrshlq_m): Likewise.
(vrshrq_m_n): Likewise.
(vshlq_m_n): Likewise.
(vshrq_m_n): Likewise.
(vsliq_m): Likewise.
(vaddq_m_n): Likewise.
(vaddq_m): Likewise.
(vhaddq_m_n): Likewise.
(vhaddq_m): Likewise.
(vhcaddq_rot270_m): Likewise.
(vhcaddq_rot90_m): Likewise.
(vhsubq_m): Likewise.
(vhsubq_m_n): Likewise.
(vmulq_m_n): Likewise.
(vmulq_m): Likewise.
(vqaddq_m_n): Likewise.
(vqaddq_m): Likewise.
(vqdmulhq_m_n): Likewise.
(vqdmulhq_m): Likewise.
(vsubq_m_n): Likewise.
(vsliq_m_n): Likewise.
(vqsubq_m_n): Likewise.
(vqsubq_m): Likewise.
(vqrdmulhq_m): Likewise.
(vqrdmulhq_m_n): Likewise.
(vqrdmlsdhxq_m): Likewise.
(vqrdmlsdhq_m): Likewise.
(vqrdmladhq_m): Likewise.
(vqrdmladhxq_m): Likewise.
(vmlsdavaxq_p): Likewise.
(vmlsdavaq_p): Likewise.
(vmladavaxq_p): Likewise.
* config/arm/arm_mve_builtins.def (QUADOP_NONE_NONE_NONE_IMM_UNONE): Use
builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_NONE_UNONE): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (VHSUBQ_M): Define iterators.
(VSLIQ_M_N): Likewise.
(VQRDMLAHQ_M_N): Likewise.
(VRSHLQ_M): Likewise.
(VMINQ_M): Likewise.
(VMULLBQ_INT_M): Likewise.
(VMULHQ_M): Likewise.
(VMULQ_M): Likewise.
(VHSUBQ_M_N): Likewise.
(VHADDQ_M_N): Likewise.
(VORRQ_M): Likewise.
(VRMULHQ_M): Likewise.
(VQADDQ_M): Likewise.
(VRSHRQ_M_N): Likewise.
(VQSUBQ_M_N): Likewise.
(VADDQ_M): Likewise.
(VORNQ_M): Likewise.
(VQDMLAHQ_M_N): Likewise.
(VRHADDQ_M): Likewise.
(VQSHLQ_M): Likewise.
(VANDQ_M): Likewise.
(VBICQ_M): Likewise.
(VSHLQ_M_N): Likewise.
(VCADDQ_ROT270_M): Likewise.
(VQRSHLQ_M): Likewise.
(VQADDQ_M_N): Likewise.
(VADDQ_M_N): Likewise.
(VMAXQ_M): Likewise.
(VQSUBQ_M): Likewise.
(VMLASQ_M_N): Likewise.
(VMLADAVAQ_P): Likewise.
(VBRSRQ_M_N): Likewise.
(VMULQ_M_N): Likewise.
(VCADDQ_ROT90_M): Likewise.
(VMULLTQ_INT_M): Likewise.
(VEORQ_M): Likewise.
(VSHRQ_M_N): Likewise.
(VSUBQ_M_N): Likewise.
(VHADDQ_M): Likewise.
(VABDQ_M): Likewise.
(VQRDMLASHQ_M_N): Likewise.
(VMLAQ_M_N): Likewise.
(VQSHLQ_M_N): Likewise.
(mve_vabdq_m_<supf><mode>): Define RTL pattern.
(mve_vaddq_m_n_<supf><mode>): Likewise.
(mve_vaddq_m_<supf><mode>): Likewise.
(mve_vandq_m_<supf><mode>): Likewise.
(mve_vbicq_m_<supf><mode>): Likewise.
(mve_vbrsrq_m_n_<supf><mode>): Likewise.
(mve_vcaddq_rot270_m_<supf><mode>): Likewise.
(mve_vcaddq_rot90_m_<supf><mode>): Likewise.
(mve_veorq_m_<supf><mode>): Likewise.
(mve_vhaddq_m_n_<supf><mode>): Likewise.
(mve_vhaddq_m_<supf><mode>): Likewise.
(mve_vhsubq_m_n_<supf><mode>): Likewise.
(mve_vhsubq_m_<supf><mode>): Likewise.
(mve_vmaxq_m_<supf><mode>): Likewise.
(mve_vminq_m_<supf><mode>): Likewise.
(mve_vmladavaq_p_<supf><mode>): Likewise.
(mve_vmlaq_m_n_<supf><mode>): Likewise.
(mve_vmlasq_m_n_<supf><mode>): Likewise.
(mve_vmulhq_m_<supf><mode>): Likewise.
(mve_vmullbq_int_m_<supf><mode>): Likewise.
(mve_vmulltq_int_m_<supf><mode>): Likewise.
(mve_vmulq_m_n_<supf><mode>): Likewise.
(mve_vmulq_m_<supf><mode>): Likewise.
(mve_vornq_m_<supf><mode>): Likewise.
(mve_vorrq_m_<supf><mode>): Likewise.
(mve_vqaddq_m_n_<supf><mode>): Likewise.
(mve_vqaddq_m_<supf><mode>): Likewise.
(mve_vqdmlahq_m_n_<supf><mode>): Likewise.
(mve_vqrdmlahq_m_n_<supf><mode>): Likewise.
(mve_vqrdmlashq_m_n_<supf><mode>): Likewise.
(mve_vqrshlq_m_<supf><mode>): Likewise.
(mve_vqshlq_m_n_<supf><mode>): Likewise.
(mve_vqshlq_m_<supf><mode>): Likewise.
(mve_vqsubq_m_n_<supf><mode>): Likewise.
(mve_vqsubq_m_<supf><mode>): Likewise.
(mve_vrhaddq_m_<supf><mode>): Likewise.
(mve_vrmulhq_m_<supf><mode>): Likewise.
(mve_vrshlq_m_<supf><mode>): Likewise.
(mve_vrshrq_m_n_<supf><mode>): Likewise.
(mve_vshlq_m_n_<supf><mode>): Likewise.
(mve_vshrq_m_n_<supf><mode>): Likewise.
(mve_vsliq_m_n_<supf><mode>): Likewise.
(mve_vsubq_m_n_<supf><mode>): Likewise.
(mve_vhcaddq_rot270_m_s<mode>): Likewise.
(mve_vhcaddq_rot90_m_s<mode>): Likewise.
(mve_vmladavaxq_p_s<mode>): Likewise.
(mve_vmlsdavaq_p_s<mode>): Likewise.
(mve_vmlsdavaxq_p_s<mode>): Likewise.
(mve_vqdmladhq_m_s<mode>): Likewise.
(mve_vqdmladhxq_m_s<mode>): Likewise.
(mve_vqdmlsdhq_m_s<mode>): Likewise.
(mve_vqdmlsdhxq_m_s<mode>): Likewise.
(mve_vqdmulhq_m_n_s<mode>): Likewise.
(mve_vqdmulhq_m_s<mode>): Likewise.
(mve_vqrdmladhq_m_s<mode>): Likewise.
(mve_vqrdmladhxq_m_s<mode>): Likewise.
(mve_vqrdmlsdhq_m_s<mode>): Likewise.
(mve_vqrdmlsdhxq_m_s<mode>): Likewise.
(mve_vqrdmulhq_m_n_s<mode>): Likewise.
(mve_vqrdmulhq_m_s<mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabdq_m_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabdq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabdq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vandq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbrsrq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot270_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcaddq_rot90_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/veorq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot270_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhcaddq_rot90_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vhsubq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulhq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmullbq_int_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulltq_int_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmulq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vornq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqsubq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrhaddq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmulhq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_n_u8.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:47:31 +0000 (16:47 +0000)]
[ARM][GCC][1/4x]: MVE intrinsics with quaternary operands.
This patch supports the following MVE ACLE intrinsics with quaternary operands.
vsriq_m_n_s8, vsubq_m_s8, vsubq_x_s8, vcvtq_m_n_f16_u16, vcvtq_x_n_f16_u16,
vqshluq_m_n_s8, vabavq_p_s8, vsriq_m_n_u8, vshlq_m_u8, vshlq_x_u8, vsubq_m_u8,
vsubq_x_u8, vabavq_p_u8, vshlq_m_s8, vshlq_x_s8, vcvtq_m_n_f16_s16,
vcvtq_x_n_f16_s16, vsriq_m_n_s16, vsubq_m_s16, vsubq_x_s16, vcvtq_m_n_f32_u32,
vcvtq_x_n_f32_u32, vqshluq_m_n_s16, vabavq_p_s16, vsriq_m_n_u16,
vshlq_m_u16, vshlq_x_u16, vsubq_m_u16, vsubq_x_u16, vabavq_p_u16, vshlq_m_s16,
vshlq_x_s16, vcvtq_m_n_f32_s32, vcvtq_x_n_f32_s32, vsriq_m_n_s32, vsubq_m_s32,
vsubq_x_s32, vqshluq_m_n_s32, vabavq_p_s32, vsriq_m_n_u32, vshlq_m_u32,
vshlq_x_u32, vsubq_m_u32, vsubq_x_u32, vabavq_p_u32, vshlq_m_s32, vshlq_x_s32.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
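As an illustration only (not part of the patch), here is a minimal usage sketch of one of the predicated quaternary-operand intrinsics.  The prototype shown is assumed from the MVE ACLE convention (an "inactive" vector plus a predicate), and the build flags are assumed to be those of an MVE-enabled M-profile target, e.g. -march=armv8.1-m.main+mve -mfloat-abi=hard.

#include <arm_mve.h>

/* Predicated subtract: lanes where the predicate bit is set receive a - b,
   all other lanes keep the corresponding lane of 'inactive'.  */
int8x16_t
predicated_sub (int8x16_t inactive, int8x16_t a, int8x16_t b, mve_pred16_t p)
{
  return vsubq_m_s8 (inactive, a, b, p);
}

The polymorphic variant vsubq_m defined by this patch lets the same call be written without the _s8 suffix.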
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (QUADOP_UNONE_UNONE_NONE_NONE_UNONE_QUALIFIERS):
Define builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vsriq_m_n_s8): Define macro.
(vsubq_m_s8): Likewise.
(vcvtq_m_n_f16_u16): Likewise.
(vqshluq_m_n_s8): Likewise.
(vabavq_p_s8): Likewise.
(vsriq_m_n_u8): Likewise.
(vshlq_m_u8): Likewise.
(vsubq_m_u8): Likewise.
(vabavq_p_u8): Likewise.
(vshlq_m_s8): Likewise.
(vcvtq_m_n_f16_s16): Likewise.
(vsriq_m_n_s16): Likewise.
(vsubq_m_s16): Likewise.
(vcvtq_m_n_f32_u32): Likewise.
(vqshluq_m_n_s16): Likewise.
(vabavq_p_s16): Likewise.
(vsriq_m_n_u16): Likewise.
(vshlq_m_u16): Likewise.
(vsubq_m_u16): Likewise.
(vabavq_p_u16): Likewise.
(vshlq_m_s16): Likewise.
(vcvtq_m_n_f32_s32): Likewise.
(vsriq_m_n_s32): Likewise.
(vsubq_m_s32): Likewise.
(vqshluq_m_n_s32): Likewise.
(vabavq_p_s32): Likewise.
(vsriq_m_n_u32): Likewise.
(vshlq_m_u32): Likewise.
(vsubq_m_u32): Likewise.
(vabavq_p_u32): Likewise.
(vshlq_m_s32): Likewise.
(__arm_vsriq_m_n_s8): Define intrinsic.
(__arm_vsubq_m_s8): Likewise.
(__arm_vqshluq_m_n_s8): Likewise.
(__arm_vabavq_p_s8): Likewise.
(__arm_vsriq_m_n_u8): Likewise.
(__arm_vshlq_m_u8): Likewise.
(__arm_vsubq_m_u8): Likewise.
(__arm_vabavq_p_u8): Likewise.
(__arm_vshlq_m_s8): Likewise.
(__arm_vsriq_m_n_s16): Likewise.
(__arm_vsubq_m_s16): Likewise.
(__arm_vqshluq_m_n_s16): Likewise.
(__arm_vabavq_p_s16): Likewise.
(__arm_vsriq_m_n_u16): Likewise.
(__arm_vshlq_m_u16): Likewise.
(__arm_vsubq_m_u16): Likewise.
(__arm_vabavq_p_u16): Likewise.
(__arm_vshlq_m_s16): Likewise.
(__arm_vsriq_m_n_s32): Likewise.
(__arm_vsubq_m_s32): Likewise.
(__arm_vqshluq_m_n_s32): Likewise.
(__arm_vabavq_p_s32): Likewise.
(__arm_vsriq_m_n_u32): Likewise.
(__arm_vshlq_m_u32): Likewise.
(__arm_vsubq_m_u32): Likewise.
(__arm_vabavq_p_u32): Likewise.
(__arm_vshlq_m_s32): Likewise.
(__arm_vcvtq_m_n_f16_u16): Likewise.
(__arm_vcvtq_m_n_f16_s16): Likewise.
(__arm_vcvtq_m_n_f32_u32): Likewise.
(__arm_vcvtq_m_n_f32_s32): Likewise.
(vcvtq_m_n): Define polymorphic variant.
(vqshluq_m_n): Likewise.
(vshlq_m): Likewise.
(vsriq_m_n): Likewise.
(vsubq_m): Likewise.
(vabavq_p): Likewise.
* config/arm/arm_mve_builtins.def
(QUADOP_UNONE_UNONE_NONE_NONE_UNONE_QUALIFIERS): Use builtin qualifier.
(QUADOP_NONE_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_NONE_NONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(QUADOP_UNONE_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
* config/arm/mve.md (VABAVQ_P): Define iterator.
(VSHLQ_M): Likewise.
(VSRIQ_M_N): Likewise.
(VSUBQ_M): Likewise.
(VCVTQ_M_N_TO_F): Likewise.
(mve_vabavq_p_<supf><mode>): Define RTL pattern.
(mve_vqshluq_m_n_s<mode>): Likewise.
(mve_vshlq_m_<supf><mode>): Likewise.
(mve_vsriq_m_n_<supf><mode>): Likewise.
(mve_vsubq_m_<supf><mode>): Likewise.
(mve_vcvtq_m_n_to_f_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabavq_p_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_n_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshluq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsubq_m_u8.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:37:18 +0000 (16:37 +0000)]
[ARM][GCC][3/3x]: MVE intrinsics with ternary operands.
This patch supports the following MVE ACLE intrinsics with ternary operands.
vrmlaldavhaxq_s32, vrmlsldavhaq_s32, vrmlsldavhaxq_s32, vaddlvaq_p_s32, vcvtbq_m_f16_f32, vcvtbq_m_f32_f16, vcvttq_m_f16_f32, vcvttq_m_f32_f16, vrev16q_m_s8, vrev32q_m_f16, vrmlaldavhq_p_s32, vrmlaldavhxq_p_s32, vrmlsldavhq_p_s32, vrmlsldavhxq_p_s32, vaddlvaq_p_u32, vrev16q_m_u8, vrmlaldavhq_p_u32, vmvnq_m_n_s16, vorrq_m_n_s16, vqrshrntq_n_s16, vqshrnbq_n_s16, vqshrntq_n_s16, vrshrnbq_n_s16, vrshrntq_n_s16, vshrnbq_n_s16, vshrntq_n_s16, vcmlaq_f16, vcmlaq_rot180_f16, vcmlaq_rot270_f16, vcmlaq_rot90_f16, vfmaq_f16, vfmaq_n_f16, vfmasq_n_f16, vfmsq_f16, vmlaldavaq_s16, vmlaldavaxq_s16, vmlsldavaq_s16, vmlsldavaxq_s16, vabsq_m_f16, vcvtmq_m_s16_f16, vcvtnq_m_s16_f16, vcvtpq_m_s16_f16, vcvtq_m_s16_f16, vdupq_m_n_f16, vmaxnmaq_m_f16, vmaxnmavq_p_f16, vmaxnmvq_p_f16, vminnmaq_m_f16, vminnmavq_p_f16, vminnmvq_p_f16, vmlaldavq_p_s16, vmlaldavxq_p_s16, vmlsldavq_p_s16, vmlsldavxq_p_s16, vmovlbq_m_s8, vmovltq_m_s8, vmovnbq_m_s16, vmovntq_m_s16, vnegq_m_f16, vpselq_f16, vqmovnbq_m_s16, vqmovntq_m_s16, vrev32q_m_s8, vrev64q_m_f16, vrndaq_m_f16, vrndmq_m_f16, vrndnq_m_f16, vrndpq_m_f16, vrndq_m_f16, vrndxq_m_f16, vcmpeqq_m_n_f16, vcmpgeq_m_f16, vcmpgeq_m_n_f16, vcmpgtq_m_f16, vcmpgtq_m_n_f16, vcmpleq_m_f16, vcmpleq_m_n_f16, vcmpltq_m_f16, vcmpltq_m_n_f16, vcmpneq_m_f16, vcmpneq_m_n_f16, vmvnq_m_n_u16, vorrq_m_n_u16, vqrshruntq_n_s16, vqshrunbq_n_s16, vqshruntq_n_s16, vcvtmq_m_u16_f16, vcvtnq_m_u16_f16, vcvtpq_m_u16_f16, vcvtq_m_u16_f16, vqmovunbq_m_s16, vqmovuntq_m_s16, vqrshrntq_n_u16, vqshrnbq_n_u16, vqshrntq_n_u16, vrshrnbq_n_u16, vrshrntq_n_u16, vshrnbq_n_u16, vshrntq_n_u16, vmlaldavaq_u16, vmlaldavaxq_u16, vmlaldavq_p_u16, vmlaldavxq_p_u16, vmovlbq_m_u8, vmovltq_m_u8, vmovnbq_m_u16, vmovntq_m_u16, vqmovnbq_m_u16, vqmovntq_m_u16, vrev32q_m_u8, vmvnq_m_n_s32, vorrq_m_n_s32, vqrshrntq_n_s32, vqshrnbq_n_s32, vqshrntq_n_s32, vrshrnbq_n_s32, vrshrntq_n_s32, vshrnbq_n_s32, vshrntq_n_s32, vcmlaq_f32, vcmlaq_rot180_f32, vcmlaq_rot270_f32, vcmlaq_rot90_f32, vfmaq_f32, vfmaq_n_f32, vfmasq_n_f32, vfmsq_f32, vmlaldavaq_s32, vmlaldavaxq_s32, vmlsldavaq_s32, vmlsldavaxq_s32, vabsq_m_f32, vcvtmq_m_s32_f32, vcvtnq_m_s32_f32, vcvtpq_m_s32_f32, vcvtq_m_s32_f32, vdupq_m_n_f32, vmaxnmaq_m_f32, vmaxnmavq_p_f32, vmaxnmvq_p_f32, vminnmaq_m_f32, vminnmavq_p_f32, vminnmvq_p_f32, vmlaldavq_p_s32, vmlaldavxq_p_s32, vmlsldavq_p_s32, vmlsldavxq_p_s32, vmovlbq_m_s16, vmovltq_m_s16, vmovnbq_m_s32, vmovntq_m_s32, vnegq_m_f32, vpselq_f32, vqmovnbq_m_s32, vqmovntq_m_s32, vrev32q_m_s16, vrev64q_m_f32, vrndaq_m_f32, vrndmq_m_f32, vrndnq_m_f32, vrndpq_m_f32, vrndq_m_f32, vrndxq_m_f32, vcmpeqq_m_n_f32, vcmpgeq_m_f32, vcmpgeq_m_n_f32, vcmpgtq_m_f32, vcmpgtq_m_n_f32, vcmpleq_m_f32, vcmpleq_m_n_f32, vcmpltq_m_f32, vcmpltq_m_n_f32, vcmpneq_m_f32, vcmpneq_m_n_f32, vmvnq_m_n_u32, vorrq_m_n_u32, vqrshruntq_n_s32, vqshrunbq_n_s32, vqshruntq_n_s32, vcvtmq_m_u32_f32, vcvtnq_m_u32_f32, vcvtpq_m_u32_f32, vcvtq_m_u32_f32, vqmovunbq_m_s32, vqmovuntq_m_s32, vqrshrntq_n_u32, vqshrnbq_n_u32, vqshrntq_n_u32, vrshrnbq_n_u32, vrshrntq_n_u32, vshrnbq_n_u32, vshrntq_n_u32, vmlaldavaq_u32, vmlaldavaxq_u32, vmlaldavq_p_u32, vmlaldavxq_p_u32, vmovlbq_m_u16, vmovltq_m_u16, vmovnbq_m_u32, vmovntq_m_u32, vqmovnbq_m_u32, vqmovntq_m_u32, vrev32q_m_u16.
Please refer to the M-profile Vector Extension (MVE) intrinsics documentation [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
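As an illustration only (not part of the patch), here is a minimal sketch of two of the ternary-operand shapes.  The prototypes are assumed from the MVE ACLE, and the target is assumed to be MVE-with-FP capable, e.g. -march=armv8.1-m.main+mve.fp -mfloat-abi=hard.

#include <arm_mve.h>

/* Per-lane fused multiply-add: returns acc + (m1 * m2).  */
float32x4_t
fused_mla (float32x4_t acc, float32x4_t m1, float32x4_t m2)
{
  return vfmaq_f32 (acc, m1, m2);
}

/* Predicated multiply-accumulate-long across the vector: only the lanes
   selected by the predicate contribute to the 64-bit result.  */
int64_t
predicated_mlal (int32x4_t a, int32x4_t b, mve_pred16_t p)
{
  return vmlaldavq_p_s32 (a, b, p);
}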
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vrmlaldavhaxq_s32): Define macro.
(vrmlsldavhaq_s32): Likewise.
(vrmlsldavhaxq_s32): Likewise.
(vaddlvaq_p_s32): Likewise.
(vcvtbq_m_f16_f32): Likewise.
(vcvtbq_m_f32_f16): Likewise.
(vcvttq_m_f16_f32): Likewise.
(vcvttq_m_f32_f16): Likewise.
(vrev16q_m_s8): Likewise.
(vrev32q_m_f16): Likewise.
(vrmlaldavhq_p_s32): Likewise.
(vrmlaldavhxq_p_s32): Likewise.
(vrmlsldavhq_p_s32): Likewise.
(vrmlsldavhxq_p_s32): Likewise.
(vaddlvaq_p_u32): Likewise.
(vrev16q_m_u8): Likewise.
(vrmlaldavhq_p_u32): Likewise.
(vmvnq_m_n_s16): Likewise.
(vorrq_m_n_s16): Likewise.
(vqrshrntq_n_s16): Likewise.
(vqshrnbq_n_s16): Likewise.
(vqshrntq_n_s16): Likewise.
(vrshrnbq_n_s16): Likewise.
(vrshrntq_n_s16): Likewise.
(vshrnbq_n_s16): Likewise.
(vshrntq_n_s16): Likewise.
(vcmlaq_f16): Likewise.
(vcmlaq_rot180_f16): Likewise.
(vcmlaq_rot270_f16): Likewise.
(vcmlaq_rot90_f16): Likewise.
(vfmaq_f16): Likewise.
(vfmaq_n_f16): Likewise.
(vfmasq_n_f16): Likewise.
(vfmsq_f16): Likewise.
(vmlaldavaq_s16): Likewise.
(vmlaldavaxq_s16): Likewise.
(vmlsldavaq_s16): Likewise.
(vmlsldavaxq_s16): Likewise.
(vabsq_m_f16): Likewise.
(vcvtmq_m_s16_f16): Likewise.
(vcvtnq_m_s16_f16): Likewise.
(vcvtpq_m_s16_f16): Likewise.
(vcvtq_m_s16_f16): Likewise.
(vdupq_m_n_f16): Likewise.
(vmaxnmaq_m_f16): Likewise.
(vmaxnmavq_p_f16): Likewise.
(vmaxnmvq_p_f16): Likewise.
(vminnmaq_m_f16): Likewise.
(vminnmavq_p_f16): Likewise.
(vminnmvq_p_f16): Likewise.
(vmlaldavq_p_s16): Likewise.
(vmlaldavxq_p_s16): Likewise.
(vmlsldavq_p_s16): Likewise.
(vmlsldavxq_p_s16): Likewise.
(vmovlbq_m_s8): Likewise.
(vmovltq_m_s8): Likewise.
(vmovnbq_m_s16): Likewise.
(vmovntq_m_s16): Likewise.
(vnegq_m_f16): Likewise.
(vpselq_f16): Likewise.
(vqmovnbq_m_s16): Likewise.
(vqmovntq_m_s16): Likewise.
(vrev32q_m_s8): Likewise.
(vrev64q_m_f16): Likewise.
(vrndaq_m_f16): Likewise.
(vrndmq_m_f16): Likewise.
(vrndnq_m_f16): Likewise.
(vrndpq_m_f16): Likewise.
(vrndq_m_f16): Likewise.
(vrndxq_m_f16): Likewise.
(vcmpeqq_m_n_f16): Likewise.
(vcmpgeq_m_f16): Likewise.
(vcmpgeq_m_n_f16): Likewise.
(vcmpgtq_m_f16): Likewise.
(vcmpgtq_m_n_f16): Likewise.
(vcmpleq_m_f16): Likewise.
(vcmpleq_m_n_f16): Likewise.
(vcmpltq_m_f16): Likewise.
(vcmpltq_m_n_f16): Likewise.
(vcmpneq_m_f16): Likewise.
(vcmpneq_m_n_f16): Likewise.
(vmvnq_m_n_u16): Likewise.
(vorrq_m_n_u16): Likewise.
(vqrshruntq_n_s16): Likewise.
(vqshrunbq_n_s16): Likewise.
(vqshruntq_n_s16): Likewise.
(vcvtmq_m_u16_f16): Likewise.
(vcvtnq_m_u16_f16): Likewise.
(vcvtpq_m_u16_f16): Likewise.
(vcvtq_m_u16_f16): Likewise.
(vqmovunbq_m_s16): Likewise.
(vqmovuntq_m_s16): Likewise.
(vqrshrntq_n_u16): Likewise.
(vqshrnbq_n_u16): Likewise.
(vqshrntq_n_u16): Likewise.
(vrshrnbq_n_u16): Likewise.
(vrshrntq_n_u16): Likewise.
(vshrnbq_n_u16): Likewise.
(vshrntq_n_u16): Likewise.
(vmlaldavaq_u16): Likewise.
(vmlaldavaxq_u16): Likewise.
(vmlaldavq_p_u16): Likewise.
(vmlaldavxq_p_u16): Likewise.
(vmovlbq_m_u8): Likewise.
(vmovltq_m_u8): Likewise.
(vmovnbq_m_u16): Likewise.
(vmovntq_m_u16): Likewise.
(vqmovnbq_m_u16): Likewise.
(vqmovntq_m_u16): Likewise.
(vrev32q_m_u8): Likewise.
(vmvnq_m_n_s32): Likewise.
(vorrq_m_n_s32): Likewise.
(vqrshrntq_n_s32): Likewise.
(vqshrnbq_n_s32): Likewise.
(vqshrntq_n_s32): Likewise.
(vrshrnbq_n_s32): Likewise.
(vrshrntq_n_s32): Likewise.
(vshrnbq_n_s32): Likewise.
(vshrntq_n_s32): Likewise.
(vcmlaq_f32): Likewise.
(vcmlaq_rot180_f32): Likewise.
(vcmlaq_rot270_f32): Likewise.
(vcmlaq_rot90_f32): Likewise.
(vfmaq_f32): Likewise.
(vfmaq_n_f32): Likewise.
(vfmasq_n_f32): Likewise.
(vfmsq_f32): Likewise.
(vmlaldavaq_s32): Likewise.
(vmlaldavaxq_s32): Likewise.
(vmlsldavaq_s32): Likewise.
(vmlsldavaxq_s32): Likewise.
(vabsq_m_f32): Likewise.
(vcvtmq_m_s32_f32): Likewise.
(vcvtnq_m_s32_f32): Likewise.
(vcvtpq_m_s32_f32): Likewise.
(vcvtq_m_s32_f32): Likewise.
(vdupq_m_n_f32): Likewise.
(vmaxnmaq_m_f32): Likewise.
(vmaxnmavq_p_f32): Likewise.
(vmaxnmvq_p_f32): Likewise.
(vminnmaq_m_f32): Likewise.
(vminnmavq_p_f32): Likewise.
(vminnmvq_p_f32): Likewise.
(vmlaldavq_p_s32): Likewise.
(vmlaldavxq_p_s32): Likewise.
(vmlsldavq_p_s32): Likewise.
(vmlsldavxq_p_s32): Likewise.
(vmovlbq_m_s16): Likewise.
(vmovltq_m_s16): Likewise.
(vmovnbq_m_s32): Likewise.
(vmovntq_m_s32): Likewise.
(vnegq_m_f32): Likewise.
(vpselq_f32): Likewise.
(vqmovnbq_m_s32): Likewise.
(vqmovntq_m_s32): Likewise.
(vrev32q_m_s16): Likewise.
(vrev64q_m_f32): Likewise.
(vrndaq_m_f32): Likewise.
(vrndmq_m_f32): Likewise.
(vrndnq_m_f32): Likewise.
(vrndpq_m_f32): Likewise.
(vrndq_m_f32): Likewise.
(vrndxq_m_f32): Likewise.
(vcmpeqq_m_n_f32): Likewise.
(vcmpgeq_m_f32): Likewise.
(vcmpgeq_m_n_f32): Likewise.
(vcmpgtq_m_f32): Likewise.
(vcmpgtq_m_n_f32): Likewise.
(vcmpleq_m_f32): Likewise.
(vcmpleq_m_n_f32): Likewise.
(vcmpltq_m_f32): Likewise.
(vcmpltq_m_n_f32): Likewise.
(vcmpneq_m_f32): Likewise.
(vcmpneq_m_n_f32): Likewise.
(vmvnq_m_n_u32): Likewise.
(vorrq_m_n_u32): Likewise.
(vqrshruntq_n_s32): Likewise.
(vqshrunbq_n_s32): Likewise.
(vqshruntq_n_s32): Likewise.
(vcvtmq_m_u32_f32): Likewise.
(vcvtnq_m_u32_f32): Likewise.
(vcvtpq_m_u32_f32): Likewise.
(vcvtq_m_u32_f32): Likewise.
(vqmovunbq_m_s32): Likewise.
(vqmovuntq_m_s32): Likewise.
(vqrshrntq_n_u32): Likewise.
(vqshrnbq_n_u32): Likewise.
(vqshrntq_n_u32): Likewise.
(vrshrnbq_n_u32): Likewise.
(vrshrntq_n_u32): Likewise.
(vshrnbq_n_u32): Likewise.
(vshrntq_n_u32): Likewise.
(vmlaldavaq_u32): Likewise.
(vmlaldavaxq_u32): Likewise.
(vmlaldavq_p_u32): Likewise.
(vmlaldavxq_p_u32): Likewise.
(vmovlbq_m_u16): Likewise.
(vmovltq_m_u16): Likewise.
(vmovnbq_m_u32): Likewise.
(vmovntq_m_u32): Likewise.
(vqmovnbq_m_u32): Likewise.
(vqmovntq_m_u32): Likewise.
(vrev32q_m_u16): Likewise.
(__arm_vrmlaldavhaxq_s32): Define intrinsic.
(__arm_vrmlsldavhaq_s32): Likewise.
(__arm_vrmlsldavhaxq_s32): Likewise.
(__arm_vaddlvaq_p_s32): Likewise.
(__arm_vrev16q_m_s8): Likewise.
(__arm_vrmlaldavhq_p_s32): Likewise.
(__arm_vrmlaldavhxq_p_s32): Likewise.
(__arm_vrmlsldavhq_p_s32): Likewise.
(__arm_vrmlsldavhxq_p_s32): Likewise.
(__arm_vaddlvaq_p_u32): Likewise.
(__arm_vrev16q_m_u8): Likewise.
(__arm_vrmlaldavhq_p_u32): Likewise.
(__arm_vmvnq_m_n_s16): Likewise.
(__arm_vorrq_m_n_s16): Likewise.
(__arm_vqrshrntq_n_s16): Likewise.
(__arm_vqshrnbq_n_s16): Likewise.
(__arm_vqshrntq_n_s16): Likewise.
(__arm_vrshrnbq_n_s16): Likewise.
(__arm_vrshrntq_n_s16): Likewise.
(__arm_vshrnbq_n_s16): Likewise.
(__arm_vshrntq_n_s16): Likewise.
(__arm_vmlaldavaq_s16): Likewise.
(__arm_vmlaldavaxq_s16): Likewise.
(__arm_vmlsldavaq_s16): Likewise.
(__arm_vmlsldavaxq_s16): Likewise.
(__arm_vmlaldavq_p_s16): Likewise.
(__arm_vmlaldavxq_p_s16): Likewise.
(__arm_vmlsldavq_p_s16): Likewise.
(__arm_vmlsldavxq_p_s16): Likewise.
(__arm_vmovlbq_m_s8): Likewise.
(__arm_vmovltq_m_s8): Likewise.
(__arm_vmovnbq_m_s16): Likewise.
(__arm_vmovntq_m_s16): Likewise.
(__arm_vqmovnbq_m_s16): Likewise.
(__arm_vqmovntq_m_s16): Likewise.
(__arm_vrev32q_m_s8): Likewise.
(__arm_vmvnq_m_n_u16): Likewise.
(__arm_vorrq_m_n_u16): Likewise.
(__arm_vqrshruntq_n_s16): Likewise.
(__arm_vqshrunbq_n_s16): Likewise.
(__arm_vqshruntq_n_s16): Likewise.
(__arm_vqmovunbq_m_s16): Likewise.
(__arm_vqmovuntq_m_s16): Likewise.
(__arm_vqrshrntq_n_u16): Likewise.
(__arm_vqshrnbq_n_u16): Likewise.
(__arm_vqshrntq_n_u16): Likewise.
(__arm_vrshrnbq_n_u16): Likewise.
(__arm_vrshrntq_n_u16): Likewise.
(__arm_vshrnbq_n_u16): Likewise.
(__arm_vshrntq_n_u16): Likewise.
(__arm_vmlaldavaq_u16): Likewise.
(__arm_vmlaldavaxq_u16): Likewise.
(__arm_vmlaldavq_p_u16): Likewise.
(__arm_vmlaldavxq_p_u16): Likewise.
(__arm_vmovlbq_m_u8): Likewise.
(__arm_vmovltq_m_u8): Likewise.
(__arm_vmovnbq_m_u16): Likewise.
(__arm_vmovntq_m_u16): Likewise.
(__arm_vqmovnbq_m_u16): Likewise.
(__arm_vqmovntq_m_u16): Likewise.
(__arm_vrev32q_m_u8): Likewise.
(__arm_vmvnq_m_n_s32): Likewise.
(__arm_vorrq_m_n_s32): Likewise.
(__arm_vqrshrntq_n_s32): Likewise.
(__arm_vqshrnbq_n_s32): Likewise.
(__arm_vqshrntq_n_s32): Likewise.
(__arm_vrshrnbq_n_s32): Likewise.
(__arm_vrshrntq_n_s32): Likewise.
(__arm_vshrnbq_n_s32): Likewise.
(__arm_vshrntq_n_s32): Likewise.
(__arm_vmlaldavaq_s32): Likewise.
(__arm_vmlaldavaxq_s32): Likewise.
(__arm_vmlsldavaq_s32): Likewise.
(__arm_vmlsldavaxq_s32): Likewise.
(__arm_vmlaldavq_p_s32): Likewise.
(__arm_vmlaldavxq_p_s32): Likewise.
(__arm_vmlsldavq_p_s32): Likewise.
(__arm_vmlsldavxq_p_s32): Likewise.
(__arm_vmovlbq_m_s16): Likewise.
(__arm_vmovltq_m_s16): Likewise.
(__arm_vmovnbq_m_s32): Likewise.
(__arm_vmovntq_m_s32): Likewise.
(__arm_vqmovnbq_m_s32): Likewise.
(__arm_vqmovntq_m_s32): Likewise.
(__arm_vrev32q_m_s16): Likewise.
(__arm_vmvnq_m_n_u32): Likewise.
(__arm_vorrq_m_n_u32): Likewise.
(__arm_vqrshruntq_n_s32): Likewise.
(__arm_vqshrunbq_n_s32): Likewise.
(__arm_vqshruntq_n_s32): Likewise.
(__arm_vqmovunbq_m_s32): Likewise.
(__arm_vqmovuntq_m_s32): Likewise.
(__arm_vqrshrntq_n_u32): Likewise.
(__arm_vqshrnbq_n_u32): Likewise.
(__arm_vqshrntq_n_u32): Likewise.
(__arm_vrshrnbq_n_u32): Likewise.
(__arm_vrshrntq_n_u32): Likewise.
(__arm_vshrnbq_n_u32): Likewise.
(__arm_vshrntq_n_u32): Likewise.
(__arm_vmlaldavaq_u32): Likewise.
(__arm_vmlaldavaxq_u32): Likewise.
(__arm_vmlaldavq_p_u32): Likewise.
(__arm_vmlaldavxq_p_u32): Likewise.
(__arm_vmovlbq_m_u16): Likewise.
(__arm_vmovltq_m_u16): Likewise.
(__arm_vmovnbq_m_u32): Likewise.
(__arm_vmovntq_m_u32): Likewise.
(__arm_vqmovnbq_m_u32): Likewise.
(__arm_vqmovntq_m_u32): Likewise.
(__arm_vrev32q_m_u16): Likewise.
(__arm_vcvtbq_m_f16_f32): Likewise.
(__arm_vcvtbq_m_f32_f16): Likewise.
(__arm_vcvttq_m_f16_f32): Likewise.
(__arm_vcvttq_m_f32_f16): Likewise.
(__arm_vrev32q_m_f16): Likewise.
(__arm_vcmlaq_f16): Likewise.
(__arm_vcmlaq_rot180_f16): Likewise.
(__arm_vcmlaq_rot270_f16): Likewise.
(__arm_vcmlaq_rot90_f16): Likewise.
(__arm_vfmaq_f16): Likewise.
(__arm_vfmaq_n_f16): Likewise.
(__arm_vfmasq_n_f16): Likewise.
(__arm_vfmsq_f16): Likewise.
(__arm_vabsq_m_f16): Likewise.
(__arm_vcvtmq_m_s16_f16): Likewise.
(__arm_vcvtnq_m_s16_f16): Likewise.
(__arm_vcvtpq_m_s16_f16): Likewise.
(__arm_vcvtq_m_s16_f16): Likewise.
(__arm_vdupq_m_n_f16): Likewise.
(__arm_vmaxnmaq_m_f16): Likewise.
(__arm_vmaxnmavq_p_f16): Likewise.
(__arm_vmaxnmvq_p_f16): Likewise.
(__arm_vminnmaq_m_f16): Likewise.
(__arm_vminnmavq_p_f16): Likewise.
(__arm_vminnmvq_p_f16): Likewise.
(__arm_vnegq_m_f16): Likewise.
(__arm_vpselq_f16): Likewise.
(__arm_vrev64q_m_f16): Likewise.
(__arm_vrndaq_m_f16): Likewise.
(__arm_vrndmq_m_f16): Likewise.
(__arm_vrndnq_m_f16): Likewise.
(__arm_vrndpq_m_f16): Likewise.
(__arm_vrndq_m_f16): Likewise.
(__arm_vrndxq_m_f16): Likewise.
(__arm_vcmpeqq_m_n_f16): Likewise.
(__arm_vcmpgeq_m_f16): Likewise.
(__arm_vcmpgeq_m_n_f16): Likewise.
(__arm_vcmpgtq_m_f16): Likewise.
(__arm_vcmpgtq_m_n_f16): Likewise.
(__arm_vcmpleq_m_f16): Likewise.
(__arm_vcmpleq_m_n_f16): Likewise.
(__arm_vcmpltq_m_f16): Likewise.
(__arm_vcmpltq_m_n_f16): Likewise.
(__arm_vcmpneq_m_f16): Likewise.
(__arm_vcmpneq_m_n_f16): Likewise.
(__arm_vcvtmq_m_u16_f16): Likewise.
(__arm_vcvtnq_m_u16_f16): Likewise.
(__arm_vcvtpq_m_u16_f16): Likewise.
(__arm_vcvtq_m_u16_f16): Likewise.
(__arm_vcmlaq_f32): Likewise.
(__arm_vcmlaq_rot180_f32): Likewise.
(__arm_vcmlaq_rot270_f32): Likewise.
(__arm_vcmlaq_rot90_f32): Likewise.
(__arm_vfmaq_f32): Likewise.
(__arm_vfmaq_n_f32): Likewise.
(__arm_vfmasq_n_f32): Likewise.
(__arm_vfmsq_f32): Likewise.
(__arm_vabsq_m_f32): Likewise.
(__arm_vcvtmq_m_s32_f32): Likewise.
(__arm_vcvtnq_m_s32_f32): Likewise.
(__arm_vcvtpq_m_s32_f32): Likewise.
(__arm_vcvtq_m_s32_f32): Likewise.
(__arm_vdupq_m_n_f32): Likewise.
(__arm_vmaxnmaq_m_f32): Likewise.
(__arm_vmaxnmavq_p_f32): Likewise.
(__arm_vmaxnmvq_p_f32): Likewise.
(__arm_vminnmaq_m_f32): Likewise.
(__arm_vminnmavq_p_f32): Likewise.
(__arm_vminnmvq_p_f32): Likewise.
(__arm_vnegq_m_f32): Likewise.
(__arm_vpselq_f32): Likewise.
(__arm_vrev64q_m_f32): Likewise.
(__arm_vrndaq_m_f32): Likewise.
(__arm_vrndmq_m_f32): Likewise.
(__arm_vrndnq_m_f32): Likewise.
(__arm_vrndpq_m_f32): Likewise.
(__arm_vrndq_m_f32): Likewise.
(__arm_vrndxq_m_f32): Likewise.
(__arm_vcmpeqq_m_n_f32): Likewise.
(__arm_vcmpgeq_m_f32): Likewise.
(__arm_vcmpgeq_m_n_f32): Likewise.
(__arm_vcmpgtq_m_f32): Likewise.
(__arm_vcmpgtq_m_n_f32): Likewise.
(__arm_vcmpleq_m_f32): Likewise.
(__arm_vcmpleq_m_n_f32): Likewise.
(__arm_vcmpltq_m_f32): Likewise.
(__arm_vcmpltq_m_n_f32): Likewise.
(__arm_vcmpneq_m_f32): Likewise.
(__arm_vcmpneq_m_n_f32): Likewise.
(__arm_vcvtmq_m_u32_f32): Likewise.
(__arm_vcvtnq_m_u32_f32): Likewise.
(__arm_vcvtpq_m_u32_f32): Likewise.
(__arm_vcvtq_m_u32_f32): Likewise.
(vcvtq_m): Define polymorphic variant.
(vabsq_m): Likewise.
(vcmlaq): Likewise.
(vcmlaq_rot180): Likewise.
(vcmlaq_rot270): Likewise.
(vcmlaq_rot90): Likewise.
(vcmpeqq_m_n): Likewise.
(vcmpgeq_m_n): Likewise.
(vrndxq_m): Likewise.
(vrndq_m): Likewise.
(vrndpq_m): Likewise.
(vcmpgtq_m_n): Likewise.
(vcmpgtq_m): Likewise.
(vcmpleq_m): Likewise.
(vcmpleq_m_n): Likewise.
(vcmpltq_m_n): Likewise.
(vcmpltq_m): Likewise.
(vcmpneq_m): Likewise.
(vcmpneq_m_n): Likewise.
(vcvtbq_m): Likewise.
(vcvttq_m): Likewise.
(vcvtmq_m): Likewise.
(vcvtnq_m): Likewise.
(vcvtpq_m): Likewise.
(vdupq_m_n): Likewise.
(vfmaq_n): Likewise.
(vfmaq): Likewise.
(vfmasq_n): Likewise.
(vfmsq): Likewise.
(vmaxnmaq_m): Likewise.
(vmaxnmavq_m): Likewise.
(vmaxnmvq_m): Likewise.
(vmaxnmavq_p): Likewise.
(vmaxnmvq_p): Likewise.
(vminnmaq_m): Likewise.
(vminnmavq_p): Likewise.
(vminnmvq_p): Likewise.
(vrndnq_m): Likewise.
(vrndaq_m): Likewise.
(vrndmq_m): Likewise.
(vrev64q_m): Likewise.
(vrev32q_m): Likewise.
(vpselq): Likewise.
(vnegq_m): Likewise.
(vcmpgeq_m): Likewise.
(vshrntq_n): Likewise.
(vrshrntq_n): Likewise.
(vmovlbq_m): Likewise.
(vmovnbq_m): Likewise.
(vmovntq_m): Likewise.
(vmvnq_m_n): Likewise.
(vmvnq_m): Likewise.
(vshrnbq_n): Likewise.
(vrshrnbq_n): Likewise.
(vqshruntq_n): Likewise.
(vrev16q_m): Likewise.
(vqshrunbq_n): Likewise.
(vqshrntq_n): Likewise.
(vqrshruntq_n): Likewise.
(vqrshrntq_n): Likewise.
(vqshrnbq_n): Likewise.
(vqmovuntq_m): Likewise.
(vqmovntq_m): Likewise.
(vqmovnbq_m): Likewise.
(vorrq_m_n): Likewise.
(vmovltq_m): Likewise.
(vqmovunbq_m): Likewise.
(vaddlvaq_p): Likewise.
(vmlaldavaq): Likewise.
(vmlaldavaxq): Likewise.
(vmlaldavq_p): Likewise.
(vmlaldavxq_p): Likewise.
(vmlsldavaq): Likewise.
(vmlsldavaxq): Likewise.
(vmlsldavq_p): Likewise.
(vmlsldavxq_p): Likewise.
(vrmlaldavhaxq): Likewise.
(vrmlaldavhq_p): Likewise.
(vrmlaldavhxq_p): Likewise.
(vrmlsldavhaq): Likewise.
(vrmlsldavhaxq): Likewise.
(vrmlsldavhq_p): Likewise.
(vrmlsldavhxq_p): Likewise.
* config/arm/arm_mve_builtins.def (TERNOP_NONE_NONE_IMM_UNONE): Use
builtin qualifier.
(TERNOP_NONE_NONE_NONE_IMM): Likewise.
(TERNOP_NONE_NONE_NONE_NONE): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_IMM_UNONE): Likewise.
(TERNOP_UNONE_UNONE_NONE_IMM): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_UNONE_IMM): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/mve.md (MVE_constraint3): Define mode attribute iterator.
(MVE_pred3): Likewise.
(MVE_constraint1): Likewise.
(MVE_pred1): Likewise.
(VMLALDAVQ_P): Define iterator.
(VQMOVNBQ_M): Likewise.
(VMOVLTQ_M): Likewise.
(VMOVNBQ_M): Likewise.
(VRSHRNTQ_N): Likewise.
(VORRQ_M_N): Likewise.
(VREV32Q_M): Likewise.
(VREV16Q_M): Likewise.
(VQRSHRNTQ_N): Likewise.
(VMOVNTQ_M): Likewise.
(VMOVLBQ_M): Likewise.
(VMLALDAVAQ): Likewise.
(VQSHRNBQ_N): Likewise.
(VSHRNBQ_N): Likewise.
(VRSHRNBQ_N): Likewise.
(VMLALDAVXQ_P): Likewise.
(VQMOVNTQ_M): Likewise.
(VMVNQ_M_N): Likewise.
(VQSHRNTQ_N): Likewise.
(VMLALDAVAXQ): Likewise.
(VSHRNTQ_N): Likewise.
(VCVTMQ_M): Likewise.
(VCVTNQ_M): Likewise.
(VCVTPQ_M): Likewise.
(VCVTQ_M_N_FROM_F): Likewise.
(VCVTQ_M_FROM_F): Likewise.
(VRMLALDAVHQ_P): Likewise.
(VADDLVAQ_P): Likewise.
(mve_vrndq_m_f<mode>): Define RTL pattern.
(mve_vabsq_m_f<mode>): Likewise.
(mve_vaddlvaq_p_<supf>v4si): Likewise.
(mve_vcmlaq_f<mode>): Likewise.
(mve_vcmlaq_rot180_f<mode>): Likewise.
(mve_vcmlaq_rot270_f<mode>): Likewise.
(mve_vcmlaq_rot90_f<mode>): Likewise.
(mve_vcmpeqq_m_n_f<mode>): Likewise.
(mve_vcmpgeq_m_f<mode>): Likewise.
(mve_vcmpgeq_m_n_f<mode>): Likewise.
(mve_vcmpgtq_m_f<mode>): Likewise.
(mve_vcmpgtq_m_n_f<mode>): Likewise.
(mve_vcmpleq_m_f<mode>): Likewise.
(mve_vcmpleq_m_n_f<mode>): Likewise.
(mve_vcmpltq_m_f<mode>): Likewise.
(mve_vcmpltq_m_n_f<mode>): Likewise.
(mve_vcmpneq_m_f<mode>): Likewise.
(mve_vcmpneq_m_n_f<mode>): Likewise.
(mve_vcvtbq_m_f16_f32v8hf): Likewise.
(mve_vcvtbq_m_f32_f16v4sf): Likewise.
(mve_vcvttq_m_f16_f32v8hf): Likewise.
(mve_vcvttq_m_f32_f16v4sf): Likewise.
(mve_vdupq_m_n_f<mode>): Likewise.
(mve_vfmaq_f<mode>): Likewise.
(mve_vfmaq_n_f<mode>): Likewise.
(mve_vfmasq_n_f<mode>): Likewise.
(mve_vfmsq_f<mode>): Likewise.
(mve_vmaxnmaq_m_f<mode>): Likewise.
(mve_vmaxnmavq_p_f<mode>): Likewise.
(mve_vmaxnmvq_p_f<mode>): Likewise.
(mve_vminnmaq_m_f<mode>): Likewise.
(mve_vminnmavq_p_f<mode>): Likewise.
(mve_vminnmvq_p_f<mode>): Likewise.
(mve_vmlaldavaq_<supf><mode>): Likewise.
(mve_vmlaldavaxq_<supf><mode>): Likewise.
(mve_vmlaldavq_p_<supf><mode>): Likewise.
(mve_vmlaldavxq_p_<supf><mode>): Likewise.
(mve_vmlsldavaq_s<mode>): Likewise.
(mve_vmlsldavaxq_s<mode>): Likewise.
(mve_vmlsldavq_p_s<mode>): Likewise.
(mve_vmlsldavxq_p_s<mode>): Likewise.
(mve_vmovlbq_m_<supf><mode>): Likewise.
(mve_vmovltq_m_<supf><mode>): Likewise.
(mve_vmovnbq_m_<supf><mode>): Likewise.
(mve_vmovntq_m_<supf><mode>): Likewise.
(mve_vmvnq_m_n_<supf><mode>): Likewise.
(mve_vnegq_m_f<mode>): Likewise.
(mve_vorrq_m_n_<supf><mode>): Likewise.
(mve_vpselq_f<mode>): Likewise.
(mve_vqmovnbq_m_<supf><mode>): Likewise.
(mve_vqmovntq_m_<supf><mode>): Likewise.
(mve_vqmovunbq_m_s<mode>): Likewise.
(mve_vqmovuntq_m_s<mode>): Likewise.
(mve_vqrshrntq_n_<supf><mode>): Likewise.
(mve_vqrshruntq_n_s<mode>): Likewise.
(mve_vqshrnbq_n_<supf><mode>): Likewise.
(mve_vqshrntq_n_<supf><mode>): Likewise.
(mve_vqshrunbq_n_s<mode>): Likewise.
(mve_vqshruntq_n_s<mode>): Likewise.
(mve_vrev32q_m_fv8hf): Likewise.
(mve_vrev32q_m_<supf><mode>): Likewise.
(mve_vrev64q_m_f<mode>): Likewise.
(mve_vrmlaldavhaxq_sv4si): Likewise.
(mve_vrmlaldavhxq_p_sv4si): Likewise.
(mve_vrmlsldavhaxq_sv4si): Likewise.
(mve_vrmlsldavhq_p_sv4si): Likewise.
(mve_vrmlsldavhxq_p_sv4si): Likewise.
(mve_vrndaq_m_f<mode>): Likewise.
(mve_vrndmq_m_f<mode>): Likewise.
(mve_vrndnq_m_f<mode>): Likewise.
(mve_vrndpq_m_f<mode>): Likewise.
(mve_vrndxq_m_f<mode>): Likewise.
(mve_vrshrnbq_n_<supf><mode>): Likewise.
(mve_vrshrntq_n_<supf><mode>): Likewise.
(mve_vshrnbq_n_<supf><mode>): Likewise.
(mve_vshrntq_n_<supf><mode>): Likewise.
(mve_vcvtmq_m_<supf><mode>): Likewise.
(mve_vcvtpq_m_<supf><mode>): Likewise.
(mve_vcvtnq_m_<supf><mode>): Likewise.
(mve_vcvtq_m_n_from_f_<supf><mode>): Likewise.
(mve_vrev16q_m_<supf>v16qi): Likewise.
(mve_vcvtq_m_from_f_<supf><mode>): Likewise.
(mve_vrmlaldavhq_p_<supf>v4si): Likewise.
(mve_vrmlsldavhaq_sv4si): Likewise.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabsq_m_f16.c: New test.
* gcc.target/arm/mve/intrinsics/vabsq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddlvaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot180_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot270_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmlaq_rot90_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_m_f16_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtbq_m_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtmq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtnq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtpq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_m_f16_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvttq_m_f32_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmaq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_n_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmasq_n_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vfmsq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmavq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmavq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmvq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxnmvq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmavq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmavq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmvq_p_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminnmvq_p_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavaxq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaldavxq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsldavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovlbq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovltq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovnbq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmovntq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vorrq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovnbq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovntq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovunbq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovunbq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovuntq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqmovuntq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrntq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshruntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrntq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshrunbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshruntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev16q_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev32q_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlsldavhxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndaq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndmq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndnq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndpq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrndxq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshrntq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshrntq_n_u32.c: Likewise.
Srinath Parvathaneni [Wed, 18 Mar 2020 16:26:53 +0000 (16:26 +0000)]
[ARM][GCC][2/3x]: MVE intrinsics with ternary operands.
This patch supports the following MVE ACLE intrinsics with ternary operands.
vpselq_u8, vpselq_s8, vrev64q_m_u8, vqrdmlashq_n_u8, vqrdmlahq_n_u8, vqdmlahq_n_u8, vmvnq_m_u8, vmlasq_n_u8, vmlaq_n_u8, vmladavq_p_u8, vmladavaq_u8, vminvq_p_u8, vmaxvq_p_u8, vdupq_m_n_u8, vcmpneq_m_u8, vcmpneq_m_n_u8, vcmphiq_m_u8, vcmphiq_m_n_u8, vcmpeqq_m_u8, vcmpeqq_m_n_u8, vcmpcsq_m_u8, vcmpcsq_m_n_u8, vclzq_m_u8, vaddvaq_p_u8, vsriq_n_u8, vsliq_n_u8, vshlq_m_r_u8, vrshlq_m_n_u8, vqshlq_m_r_u8, vqrshlq_m_n_u8, vminavq_p_s8, vminaq_m_s8, vmaxavq_p_s8, vmaxaq_m_s8, vcmpneq_m_s8, vcmpneq_m_n_s8, vcmpltq_m_s8, vcmpltq_m_n_s8, vcmpleq_m_s8, vcmpleq_m_n_s8, vcmpgtq_m_s8, vcmpgtq_m_n_s8, vcmpgeq_m_s8, vcmpgeq_m_n_s8, vcmpeqq_m_s8, vcmpeqq_m_n_s8, vshlq_m_r_s8, vrshlq_m_n_s8, vrev64q_m_s8, vqshlq_m_r_s8, vqrshlq_m_n_s8, vqnegq_m_s8, vqabsq_m_s8, vnegq_m_s8, vmvnq_m_s8, vmlsdavxq_p_s8, vmlsdavq_p_s8, vmladavxq_p_s8, vmladavq_p_s8, vminvq_p_s8, vmaxvq_p_s8, vdupq_m_n_s8, vclzq_m_s8, vclsq_m_s8, vaddvaq_p_s8, vabsq_m_s8, vqrdmlsdhxq_s8, vqrdmlsdhq_s8, vqrdmlashq_n_s8, vqrdmlahq_n_s8, vqrdmladhxq_s8, vqrdmladhq_s8, vqdmlsdhxq_s8, vqdmlsdhq_s8, vqdmlahq_n_s8, vqdmladhxq_s8, vqdmladhq_s8, vmlsdavaxq_s8, vmlsdavaq_s8, vmlasq_n_s8, vmlaq_n_s8, vmladavaxq_s8, vmladavaq_s8, vsriq_n_s8, vsliq_n_s8, vpselq_u16, vpselq_s16, vrev64q_m_u16, vqrdmlashq_n_u16, vqrdmlahq_n_u16, vqdmlahq_n_u16, vmvnq_m_u16, vmlasq_n_u16, vmlaq_n_u16, vmladavq_p_u16, vmladavaq_u16, vminvq_p_u16, vmaxvq_p_u16, vdupq_m_n_u16, vcmpneq_m_u16, vcmpneq_m_n_u16, vcmphiq_m_u16, vcmphiq_m_n_u16, vcmpeqq_m_u16, vcmpeqq_m_n_u16, vcmpcsq_m_u16, vcmpcsq_m_n_u16, vclzq_m_u16, vaddvaq_p_u16, vsriq_n_u16, vsliq_n_u16, vshlq_m_r_u16, vrshlq_m_n_u16, vqshlq_m_r_u16, vqrshlq_m_n_u16, vminavq_p_s16, vminaq_m_s16, vmaxavq_p_s16, vmaxaq_m_s16, vcmpneq_m_s16, vcmpneq_m_n_s16, vcmpltq_m_s16, vcmpltq_m_n_s16, vcmpleq_m_s16, vcmpleq_m_n_s16, vcmpgtq_m_s16, vcmpgtq_m_n_s16, vcmpgeq_m_s16, vcmpgeq_m_n_s16, vcmpeqq_m_s16, vcmpeqq_m_n_s16, vshlq_m_r_s16, vrshlq_m_n_s16, vrev64q_m_s16, vqshlq_m_r_s16, vqrshlq_m_n_s16, vqnegq_m_s16, vqabsq_m_s16, vnegq_m_s16, vmvnq_m_s16, vmlsdavxq_p_s16, vmlsdavq_p_s16, vmladavxq_p_s16, vmladavq_p_s16, vminvq_p_s16, vmaxvq_p_s16, vdupq_m_n_s16, vclzq_m_s16, vclsq_m_s16, vaddvaq_p_s16, vabsq_m_s16, vqrdmlsdhxq_s16, vqrdmlsdhq_s16, vqrdmlashq_n_s16, vqrdmlahq_n_s16, vqrdmladhxq_s16, vqrdmladhq_s16, vqdmlsdhxq_s16, vqdmlsdhq_s16, vqdmlahq_n_s16, vqdmladhxq_s16, vqdmladhq_s16, vmlsdavaxq_s16, vmlsdavaq_s16, vmlasq_n_s16, vmlaq_n_s16, vmladavaxq_s16, vmladavaq_s16, vsriq_n_s16, vsliq_n_s16, vpselq_u32, vpselq_s32, vrev64q_m_u32, vqrdmlashq_n_u32, vqrdmlahq_n_u32, vqdmlahq_n_u32, vmvnq_m_u32, vmlasq_n_u32, vmlaq_n_u32, vmladavq_p_u32, vmladavaq_u32, vminvq_p_u32, vmaxvq_p_u32, vdupq_m_n_u32, vcmpneq_m_u32, vcmpneq_m_n_u32, vcmphiq_m_u32, vcmphiq_m_n_u32, vcmpeqq_m_u32, vcmpeqq_m_n_u32, vcmpcsq_m_u32, vcmpcsq_m_n_u32, vclzq_m_u32, vaddvaq_p_u32, vsriq_n_u32, vsliq_n_u32, vshlq_m_r_u32, vrshlq_m_n_u32, vqshlq_m_r_u32, vqrshlq_m_n_u32, vminavq_p_s32, vminaq_m_s32, vmaxavq_p_s32, vmaxaq_m_s32, vcmpneq_m_s32, vcmpneq_m_n_s32, vcmpltq_m_s32, vcmpltq_m_n_s32, vcmpleq_m_s32, vcmpleq_m_n_s32, vcmpgtq_m_s32, vcmpgtq_m_n_s32, vcmpgeq_m_s32, vcmpgeq_m_n_s32, vcmpeqq_m_s32, vcmpeqq_m_n_s32, vshlq_m_r_s32, vrshlq_m_n_s32, vrev64q_m_s32, vqshlq_m_r_s32, vqrshlq_m_n_s32, vqnegq_m_s32, vqabsq_m_s32, vnegq_m_s32, vmvnq_m_s32, vmlsdavxq_p_s32, vmlsdavq_p_s32, vmladavxq_p_s32, vmladavq_p_s32, vminvq_p_s32, vmaxvq_p_s32, vdupq_m_n_s32, vclzq_m_s32, vclsq_m_s32, vaddvaq_p_s32, vabsq_m_s32, vqrdmlsdhxq_s32, vqrdmlsdhq_s32, vqrdmlashq_n_s32, 
vqrdmlahq_n_s32, vqrdmladhxq_s32, vqrdmladhq_s32, vqdmlsdhxq_s32, vqdmlsdhq_s32, vqdmlahq_n_s32, vqdmladhxq_s32, vqdmladhq_s32, vmlsdavaxq_s32, vmlsdavaq_s32, vmlasq_n_s32, vmlaq_n_s32, vmladavaxq_s32, vmladavaq_s32, vsriq_n_s32, vsliq_n_s32, vpselq_u64, vpselq_s64.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
In this patch new constraints "Rc" and "Re" are added, which check that the constant is within the range of 0 to 15 and 0 to 31 respectively.
Also new predicates "mve_imm_15" and "mve_imm_31" are added, to check the matching constraints Rc and Re respectively.
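As a purely illustrative sketch (not taken from the patch), a fragment in the style of the new tests might chain one of the predicated compare intrinsics into a predicated select; the function body and the -march option named in the comment are assumptions for the example only:

  #include <arm_mve.h>

  /* Hypothetical usage sketch; requires an MVE-enabled target, e.g.
     -march=armv8.1-m.main+mve -mfloat-abi=hard.  */
  uint8x16_t
  select_equal (uint8x16_t a, uint8x16_t b, mve_pred16_t p)
  {
    /* Predicated compare: only lanes enabled by p are compared.  */
    mve_pred16_t eq = vcmpeqq_m_u8 (a, b, p);
    /* Predicate-based select: take lanes from a where eq is set, else b.  */
    return vpselq_u8 (a, b, eq);
  }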
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm_mve.h (vpselq_u8): Define macro.
(vpselq_s8): Likewise.
(vrev64q_m_u8): Likewise.
(vqrdmlashq_n_u8): Likewise.
(vqrdmlahq_n_u8): Likewise.
(vqdmlahq_n_u8): Likewise.
(vmvnq_m_u8): Likewise.
(vmlasq_n_u8): Likewise.
(vmlaq_n_u8): Likewise.
(vmladavq_p_u8): Likewise.
(vmladavaq_u8): Likewise.
(vminvq_p_u8): Likewise.
(vmaxvq_p_u8): Likewise.
(vdupq_m_n_u8): Likewise.
(vcmpneq_m_u8): Likewise.
(vcmpneq_m_n_u8): Likewise.
(vcmphiq_m_u8): Likewise.
(vcmphiq_m_n_u8): Likewise.
(vcmpeqq_m_u8): Likewise.
(vcmpeqq_m_n_u8): Likewise.
(vcmpcsq_m_u8): Likewise.
(vcmpcsq_m_n_u8): Likewise.
(vclzq_m_u8): Likewise.
(vaddvaq_p_u8): Likewise.
(vsriq_n_u8): Likewise.
(vsliq_n_u8): Likewise.
(vshlq_m_r_u8): Likewise.
(vrshlq_m_n_u8): Likewise.
(vqshlq_m_r_u8): Likewise.
(vqrshlq_m_n_u8): Likewise.
(vminavq_p_s8): Likewise.
(vminaq_m_s8): Likewise.
(vmaxavq_p_s8): Likewise.
(vmaxaq_m_s8): Likewise.
(vcmpneq_m_s8): Likewise.
(vcmpneq_m_n_s8): Likewise.
(vcmpltq_m_s8): Likewise.
(vcmpltq_m_n_s8): Likewise.
(vcmpleq_m_s8): Likewise.
(vcmpleq_m_n_s8): Likewise.
(vcmpgtq_m_s8): Likewise.
(vcmpgtq_m_n_s8): Likewise.
(vcmpgeq_m_s8): Likewise.
(vcmpgeq_m_n_s8): Likewise.
(vcmpeqq_m_s8): Likewise.
(vcmpeqq_m_n_s8): Likewise.
(vshlq_m_r_s8): Likewise.
(vrshlq_m_n_s8): Likewise.
(vrev64q_m_s8): Likewise.
(vqshlq_m_r_s8): Likewise.
(vqrshlq_m_n_s8): Likewise.
(vqnegq_m_s8): Likewise.
(vqabsq_m_s8): Likewise.
(vnegq_m_s8): Likewise.
(vmvnq_m_s8): Likewise.
(vmlsdavxq_p_s8): Likewise.
(vmlsdavq_p_s8): Likewise.
(vmladavxq_p_s8): Likewise.
(vmladavq_p_s8): Likewise.
(vminvq_p_s8): Likewise.
(vmaxvq_p_s8): Likewise.
(vdupq_m_n_s8): Likewise.
(vclzq_m_s8): Likewise.
(vclsq_m_s8): Likewise.
(vaddvaq_p_s8): Likewise.
(vabsq_m_s8): Likewise.
(vqrdmlsdhxq_s8): Likewise.
(vqrdmlsdhq_s8): Likewise.
(vqrdmlashq_n_s8): Likewise.
(vqrdmlahq_n_s8): Likewise.
(vqrdmladhxq_s8): Likewise.
(vqrdmladhq_s8): Likewise.
(vqdmlsdhxq_s8): Likewise.
(vqdmlsdhq_s8): Likewise.
(vqdmlahq_n_s8): Likewise.
(vqdmladhxq_s8): Likewise.
(vqdmladhq_s8): Likewise.
(vmlsdavaxq_s8): Likewise.
(vmlsdavaq_s8): Likewise.
(vmlasq_n_s8): Likewise.
(vmlaq_n_s8): Likewise.
(vmladavaxq_s8): Likewise.
(vmladavaq_s8): Likewise.
(vsriq_n_s8): Likewise.
(vsliq_n_s8): Likewise.
(vpselq_u16): Likewise.
(vpselq_s16): Likewise.
(vrev64q_m_u16): Likewise.
(vqrdmlashq_n_u16): Likewise.
(vqrdmlahq_n_u16): Likewise.
(vqdmlahq_n_u16): Likewise.
(vmvnq_m_u16): Likewise.
(vmlasq_n_u16): Likewise.
(vmlaq_n_u16): Likewise.
(vmladavq_p_u16): Likewise.
(vmladavaq_u16): Likewise.
(vminvq_p_u16): Likewise.
(vmaxvq_p_u16): Likewise.
(vdupq_m_n_u16): Likewise.
(vcmpneq_m_u16): Likewise.
(vcmpneq_m_n_u16): Likewise.
(vcmphiq_m_u16): Likewise.
(vcmphiq_m_n_u16): Likewise.
(vcmpeqq_m_u16): Likewise.
(vcmpeqq_m_n_u16): Likewise.
(vcmpcsq_m_u16): Likewise.
(vcmpcsq_m_n_u16): Likewise.
(vclzq_m_u16): Likewise.
(vaddvaq_p_u16): Likewise.
(vsriq_n_u16): Likewise.
(vsliq_n_u16): Likewise.
(vshlq_m_r_u16): Likewise.
(vrshlq_m_n_u16): Likewise.
(vqshlq_m_r_u16): Likewise.
(vqrshlq_m_n_u16): Likewise.
(vminavq_p_s16): Likewise.
(vminaq_m_s16): Likewise.
(vmaxavq_p_s16): Likewise.
(vmaxaq_m_s16): Likewise.
(vcmpneq_m_s16): Likewise.
(vcmpneq_m_n_s16): Likewise.
(vcmpltq_m_s16): Likewise.
(vcmpltq_m_n_s16): Likewise.
(vcmpleq_m_s16): Likewise.
(vcmpleq_m_n_s16): Likewise.
(vcmpgtq_m_s16): Likewise.
(vcmpgtq_m_n_s16): Likewise.
(vcmpgeq_m_s16): Likewise.
(vcmpgeq_m_n_s16): Likewise.
(vcmpeqq_m_s16): Likewise.
(vcmpeqq_m_n_s16): Likewise.
(vshlq_m_r_s16): Likewise.
(vrshlq_m_n_s16): Likewise.
(vrev64q_m_s16): Likewise.
(vqshlq_m_r_s16): Likewise.
(vqrshlq_m_n_s16): Likewise.
(vqnegq_m_s16): Likewise.
(vqabsq_m_s16): Likewise.
(vnegq_m_s16): Likewise.
(vmvnq_m_s16): Likewise.
(vmlsdavxq_p_s16): Likewise.
(vmlsdavq_p_s16): Likewise.
(vmladavxq_p_s16): Likewise.
(vmladavq_p_s16): Likewise.
(vminvq_p_s16): Likewise.
(vmaxvq_p_s16): Likewise.
(vdupq_m_n_s16): Likewise.
(vclzq_m_s16): Likewise.
(vclsq_m_s16): Likewise.
(vaddvaq_p_s16): Likewise.
(vabsq_m_s16): Likewise.
(vqrdmlsdhxq_s16): Likewise.
(vqrdmlsdhq_s16): Likewise.
(vqrdmlashq_n_s16): Likewise.
(vqrdmlahq_n_s16): Likewise.
(vqrdmladhxq_s16): Likewise.
(vqrdmladhq_s16): Likewise.
(vqdmlsdhxq_s16): Likewise.
(vqdmlsdhq_s16): Likewise.
(vqdmlahq_n_s16): Likewise.
(vqdmladhxq_s16): Likewise.
(vqdmladhq_s16): Likewise.
(vmlsdavaxq_s16): Likewise.
(vmlsdavaq_s16): Likewise.
(vmlasq_n_s16): Likewise.
(vmlaq_n_s16): Likewise.
(vmladavaxq_s16): Likewise.
(vmladavaq_s16): Likewise.
(vsriq_n_s16): Likewise.
(vsliq_n_s16): Likewise.
(vpselq_u32): Likewise.
(vpselq_s32): Likewise.
(vrev64q_m_u32): Likewise.
(vqrdmlashq_n_u32): Likewise.
(vqrdmlahq_n_u32): Likewise.
(vqdmlahq_n_u32): Likewise.
(vmvnq_m_u32): Likewise.
(vmlasq_n_u32): Likewise.
(vmlaq_n_u32): Likewise.
(vmladavq_p_u32): Likewise.
(vmladavaq_u32): Likewise.
(vminvq_p_u32): Likewise.
(vmaxvq_p_u32): Likewise.
(vdupq_m_n_u32): Likewise.
(vcmpneq_m_u32): Likewise.
(vcmpneq_m_n_u32): Likewise.
(vcmphiq_m_u32): Likewise.
(vcmphiq_m_n_u32): Likewise.
(vcmpeqq_m_u32): Likewise.
(vcmpeqq_m_n_u32): Likewise.
(vcmpcsq_m_u32): Likewise.
(vcmpcsq_m_n_u32): Likewise.
(vclzq_m_u32): Likewise.
(vaddvaq_p_u32): Likewise.
(vsriq_n_u32): Likewise.
(vsliq_n_u32): Likewise.
(vshlq_m_r_u32): Likewise.
(vrshlq_m_n_u32): Likewise.
(vqshlq_m_r_u32): Likewise.
(vqrshlq_m_n_u32): Likewise.
(vminavq_p_s32): Likewise.
(vminaq_m_s32): Likewise.
(vmaxavq_p_s32): Likewise.
(vmaxaq_m_s32): Likewise.
(vcmpneq_m_s32): Likewise.
(vcmpneq_m_n_s32): Likewise.
(vcmpltq_m_s32): Likewise.
(vcmpltq_m_n_s32): Likewise.
(vcmpleq_m_s32): Likewise.
(vcmpleq_m_n_s32): Likewise.
(vcmpgtq_m_s32): Likewise.
(vcmpgtq_m_n_s32): Likewise.
(vcmpgeq_m_s32): Likewise.
(vcmpgeq_m_n_s32): Likewise.
(vcmpeqq_m_s32): Likewise.
(vcmpeqq_m_n_s32): Likewise.
(vshlq_m_r_s32): Likewise.
(vrshlq_m_n_s32): Likewise.
(vrev64q_m_s32): Likewise.
(vqshlq_m_r_s32): Likewise.
(vqrshlq_m_n_s32): Likewise.
(vqnegq_m_s32): Likewise.
(vqabsq_m_s32): Likewise.
(vnegq_m_s32): Likewise.
(vmvnq_m_s32): Likewise.
(vmlsdavxq_p_s32): Likewise.
(vmlsdavq_p_s32): Likewise.
(vmladavxq_p_s32): Likewise.
(vmladavq_p_s32): Likewise.
(vminvq_p_s32): Likewise.
(vmaxvq_p_s32): Likewise.
(vdupq_m_n_s32): Likewise.
(vclzq_m_s32): Likewise.
(vclsq_m_s32): Likewise.
(vaddvaq_p_s32): Likewise.
(vabsq_m_s32): Likewise.
(vqrdmlsdhxq_s32): Likewise.
(vqrdmlsdhq_s32): Likewise.
(vqrdmlashq_n_s32): Likewise.
(vqrdmlahq_n_s32): Likewise.
(vqrdmladhxq_s32): Likewise.
(vqrdmladhq_s32): Likewise.
(vqdmlsdhxq_s32): Likewise.
(vqdmlsdhq_s32): Likewise.
(vqdmlahq_n_s32): Likewise.
(vqdmladhxq_s32): Likewise.
(vqdmladhq_s32): Likewise.
(vmlsdavaxq_s32): Likewise.
(vmlsdavaq_s32): Likewise.
(vmlasq_n_s32): Likewise.
(vmlaq_n_s32): Likewise.
(vmladavaxq_s32): Likewise.
(vmladavaq_s32): Likewise.
(vsriq_n_s32): Likewise.
(vsliq_n_s32): Likewise.
(vpselq_u64): Likewise.
(vpselq_s64): Likewise.
(__arm_vpselq_u8): Define intrinsic.
(__arm_vpselq_s8): Likewise.
(__arm_vrev64q_m_u8): Likewise.
(__arm_vqrdmlashq_n_u8): Likewise.
(__arm_vqrdmlahq_n_u8): Likewise.
(__arm_vqdmlahq_n_u8): Likewise.
(__arm_vmvnq_m_u8): Likewise.
(__arm_vmlasq_n_u8): Likewise.
(__arm_vmlaq_n_u8): Likewise.
(__arm_vmladavq_p_u8): Likewise.
(__arm_vmladavaq_u8): Likewise.
(__arm_vminvq_p_u8): Likewise.
(__arm_vmaxvq_p_u8): Likewise.
(__arm_vdupq_m_n_u8): Likewise.
(__arm_vcmpneq_m_u8): Likewise.
(__arm_vcmpneq_m_n_u8): Likewise.
(__arm_vcmphiq_m_u8): Likewise.
(__arm_vcmphiq_m_n_u8): Likewise.
(__arm_vcmpeqq_m_u8): Likewise.
(__arm_vcmpeqq_m_n_u8): Likewise.
(__arm_vcmpcsq_m_u8): Likewise.
(__arm_vcmpcsq_m_n_u8): Likewise.
(__arm_vclzq_m_u8): Likewise.
(__arm_vaddvaq_p_u8): Likewise.
(__arm_vsriq_n_u8): Likewise.
(__arm_vsliq_n_u8): Likewise.
(__arm_vshlq_m_r_u8): Likewise.
(__arm_vrshlq_m_n_u8): Likewise.
(__arm_vqshlq_m_r_u8): Likewise.
(__arm_vqrshlq_m_n_u8): Likewise.
(__arm_vminavq_p_s8): Likewise.
(__arm_vminaq_m_s8): Likewise.
(__arm_vmaxavq_p_s8): Likewise.
(__arm_vmaxaq_m_s8): Likewise.
(__arm_vcmpneq_m_s8): Likewise.
(__arm_vcmpneq_m_n_s8): Likewise.
(__arm_vcmpltq_m_s8): Likewise.
(__arm_vcmpltq_m_n_s8): Likewise.
(__arm_vcmpleq_m_s8): Likewise.
(__arm_vcmpleq_m_n_s8): Likewise.
(__arm_vcmpgtq_m_s8): Likewise.
(__arm_vcmpgtq_m_n_s8): Likewise.
(__arm_vcmpgeq_m_s8): Likewise.
(__arm_vcmpgeq_m_n_s8): Likewise.
(__arm_vcmpeqq_m_s8): Likewise.
(__arm_vcmpeqq_m_n_s8): Likewise.
(__arm_vshlq_m_r_s8): Likewise.
(__arm_vrshlq_m_n_s8): Likewise.
(__arm_vrev64q_m_s8): Likewise.
(__arm_vqshlq_m_r_s8): Likewise.
(__arm_vqrshlq_m_n_s8): Likewise.
(__arm_vqnegq_m_s8): Likewise.
(__arm_vqabsq_m_s8): Likewise.
(__arm_vnegq_m_s8): Likewise.
(__arm_vmvnq_m_s8): Likewise.
(__arm_vmlsdavxq_p_s8): Likewise.
(__arm_vmlsdavq_p_s8): Likewise.
(__arm_vmladavxq_p_s8): Likewise.
(__arm_vmladavq_p_s8): Likewise.
(__arm_vminvq_p_s8): Likewise.
(__arm_vmaxvq_p_s8): Likewise.
(__arm_vdupq_m_n_s8): Likewise.
(__arm_vclzq_m_s8): Likewise.
(__arm_vclsq_m_s8): Likewise.
(__arm_vaddvaq_p_s8): Likewise.
(__arm_vabsq_m_s8): Likewise.
(__arm_vqrdmlsdhxq_s8): Likewise.
(__arm_vqrdmlsdhq_s8): Likewise.
(__arm_vqrdmlashq_n_s8): Likewise.
(__arm_vqrdmlahq_n_s8): Likewise.
(__arm_vqrdmladhxq_s8): Likewise.
(__arm_vqrdmladhq_s8): Likewise.
(__arm_vqdmlsdhxq_s8): Likewise.
(__arm_vqdmlsdhq_s8): Likewise.
(__arm_vqdmlahq_n_s8): Likewise.
(__arm_vqdmladhxq_s8): Likewise.
(__arm_vqdmladhq_s8): Likewise.
(__arm_vmlsdavaxq_s8): Likewise.
(__arm_vmlsdavaq_s8): Likewise.
(__arm_vmlasq_n_s8): Likewise.
(__arm_vmlaq_n_s8): Likewise.
(__arm_vmladavaxq_s8): Likewise.
(__arm_vmladavaq_s8): Likewise.
(__arm_vsriq_n_s8): Likewise.
(__arm_vsliq_n_s8): Likewise.
(__arm_vpselq_u16): Likewise.
(__arm_vpselq_s16): Likewise.
(__arm_vrev64q_m_u16): Likewise.
(__arm_vqrdmlashq_n_u16): Likewise.
(__arm_vqrdmlahq_n_u16): Likewise.
(__arm_vqdmlahq_n_u16): Likewise.
(__arm_vmvnq_m_u16): Likewise.
(__arm_vmlasq_n_u16): Likewise.
(__arm_vmlaq_n_u16): Likewise.
(__arm_vmladavq_p_u16): Likewise.
(__arm_vmladavaq_u16): Likewise.
(__arm_vminvq_p_u16): Likewise.
(__arm_vmaxvq_p_u16): Likewise.
(__arm_vdupq_m_n_u16): Likewise.
(__arm_vcmpneq_m_u16): Likewise.
(__arm_vcmpneq_m_n_u16): Likewise.
(__arm_vcmphiq_m_u16): Likewise.
(__arm_vcmphiq_m_n_u16): Likewise.
(__arm_vcmpeqq_m_u16): Likewise.
(__arm_vcmpeqq_m_n_u16): Likewise.
(__arm_vcmpcsq_m_u16): Likewise.
(__arm_vcmpcsq_m_n_u16): Likewise.
(__arm_vclzq_m_u16): Likewise.
(__arm_vaddvaq_p_u16): Likewise.
(__arm_vsriq_n_u16): Likewise.
(__arm_vsliq_n_u16): Likewise.
(__arm_vshlq_m_r_u16): Likewise.
(__arm_vrshlq_m_n_u16): Likewise.
(__arm_vqshlq_m_r_u16): Likewise.
(__arm_vqrshlq_m_n_u16): Likewise.
(__arm_vminavq_p_s16): Likewise.
(__arm_vminaq_m_s16): Likewise.
(__arm_vmaxavq_p_s16): Likewise.
(__arm_vmaxaq_m_s16): Likewise.
(__arm_vcmpneq_m_s16): Likewise.
(__arm_vcmpneq_m_n_s16): Likewise.
(__arm_vcmpltq_m_s16): Likewise.
(__arm_vcmpltq_m_n_s16): Likewise.
(__arm_vcmpleq_m_s16): Likewise.
(__arm_vcmpleq_m_n_s16): Likewise.
(__arm_vcmpgtq_m_s16): Likewise.
(__arm_vcmpgtq_m_n_s16): Likewise.
(__arm_vcmpgeq_m_s16): Likewise.
(__arm_vcmpgeq_m_n_s16): Likewise.
(__arm_vcmpeqq_m_s16): Likewise.
(__arm_vcmpeqq_m_n_s16): Likewise.
(__arm_vshlq_m_r_s16): Likewise.
(__arm_vrshlq_m_n_s16): Likewise.
(__arm_vrev64q_m_s16): Likewise.
(__arm_vqshlq_m_r_s16): Likewise.
(__arm_vqrshlq_m_n_s16): Likewise.
(__arm_vqnegq_m_s16): Likewise.
(__arm_vqabsq_m_s16): Likewise.
(__arm_vnegq_m_s16): Likewise.
(__arm_vmvnq_m_s16): Likewise.
(__arm_vmlsdavxq_p_s16): Likewise.
(__arm_vmlsdavq_p_s16): Likewise.
(__arm_vmladavxq_p_s16): Likewise.
(__arm_vmladavq_p_s16): Likewise.
(__arm_vminvq_p_s16): Likewise.
(__arm_vmaxvq_p_s16): Likewise.
(__arm_vdupq_m_n_s16): Likewise.
(__arm_vclzq_m_s16): Likewise.
(__arm_vclsq_m_s16): Likewise.
(__arm_vaddvaq_p_s16): Likewise.
(__arm_vabsq_m_s16): Likewise.
(__arm_vqrdmlsdhxq_s16): Likewise.
(__arm_vqrdmlsdhq_s16): Likewise.
(__arm_vqrdmlashq_n_s16): Likewise.
(__arm_vqrdmlahq_n_s16): Likewise.
(__arm_vqrdmladhxq_s16): Likewise.
(__arm_vqrdmladhq_s16): Likewise.
(__arm_vqdmlsdhxq_s16): Likewise.
(__arm_vqdmlsdhq_s16): Likewise.
(__arm_vqdmlahq_n_s16): Likewise.
(__arm_vqdmladhxq_s16): Likewise.
(__arm_vqdmladhq_s16): Likewise.
(__arm_vmlsdavaxq_s16): Likewise.
(__arm_vmlsdavaq_s16): Likewise.
(__arm_vmlasq_n_s16): Likewise.
(__arm_vmlaq_n_s16): Likewise.
(__arm_vmladavaxq_s16): Likewise.
(__arm_vmladavaq_s16): Likewise.
(__arm_vsriq_n_s16): Likewise.
(__arm_vsliq_n_s16): Likewise.
(__arm_vpselq_u32): Likewise.
(__arm_vpselq_s32): Likewise.
(__arm_vrev64q_m_u32): Likewise.
(__arm_vqrdmlashq_n_u32): Likewise.
(__arm_vqrdmlahq_n_u32): Likewise.
(__arm_vqdmlahq_n_u32): Likewise.
(__arm_vmvnq_m_u32): Likewise.
(__arm_vmlasq_n_u32): Likewise.
(__arm_vmlaq_n_u32): Likewise.
(__arm_vmladavq_p_u32): Likewise.
(__arm_vmladavaq_u32): Likewise.
(__arm_vminvq_p_u32): Likewise.
(__arm_vmaxvq_p_u32): Likewise.
(__arm_vdupq_m_n_u32): Likewise.
(__arm_vcmpneq_m_u32): Likewise.
(__arm_vcmpneq_m_n_u32): Likewise.
(__arm_vcmphiq_m_u32): Likewise.
(__arm_vcmphiq_m_n_u32): Likewise.
(__arm_vcmpeqq_m_u32): Likewise.
(__arm_vcmpeqq_m_n_u32): Likewise.
(__arm_vcmpcsq_m_u32): Likewise.
(__arm_vcmpcsq_m_n_u32): Likewise.
(__arm_vclzq_m_u32): Likewise.
(__arm_vaddvaq_p_u32): Likewise.
(__arm_vsriq_n_u32): Likewise.
(__arm_vsliq_n_u32): Likewise.
(__arm_vshlq_m_r_u32): Likewise.
(__arm_vrshlq_m_n_u32): Likewise.
(__arm_vqshlq_m_r_u32): Likewise.
(__arm_vqrshlq_m_n_u32): Likewise.
(__arm_vminavq_p_s32): Likewise.
(__arm_vminaq_m_s32): Likewise.
(__arm_vmaxavq_p_s32): Likewise.
(__arm_vmaxaq_m_s32): Likewise.
(__arm_vcmpneq_m_s32): Likewise.
(__arm_vcmpneq_m_n_s32): Likewise.
(__arm_vcmpltq_m_s32): Likewise.
(__arm_vcmpltq_m_n_s32): Likewise.
(__arm_vcmpleq_m_s32): Likewise.
(__arm_vcmpleq_m_n_s32): Likewise.
(__arm_vcmpgtq_m_s32): Likewise.
(__arm_vcmpgtq_m_n_s32): Likewise.
(__arm_vcmpgeq_m_s32): Likewise.
(__arm_vcmpgeq_m_n_s32): Likewise.
(__arm_vcmpeqq_m_s32): Likewise.
(__arm_vcmpeqq_m_n_s32): Likewise.
(__arm_vshlq_m_r_s32): Likewise.
(__arm_vrshlq_m_n_s32): Likewise.
(__arm_vrev64q_m_s32): Likewise.
(__arm_vqshlq_m_r_s32): Likewise.
(__arm_vqrshlq_m_n_s32): Likewise.
(__arm_vqnegq_m_s32): Likewise.
(__arm_vqabsq_m_s32): Likewise.
(__arm_vnegq_m_s32): Likewise.
(__arm_vmvnq_m_s32): Likewise.
(__arm_vmlsdavxq_p_s32): Likewise.
(__arm_vmlsdavq_p_s32): Likewise.
(__arm_vmladavxq_p_s32): Likewise.
(__arm_vmladavq_p_s32): Likewise.
(__arm_vminvq_p_s32): Likewise.
(__arm_vmaxvq_p_s32): Likewise.
(__arm_vdupq_m_n_s32): Likewise.
(__arm_vclzq_m_s32): Likewise.
(__arm_vclsq_m_s32): Likewise.
(__arm_vaddvaq_p_s32): Likewise.
(__arm_vabsq_m_s32): Likewise.
(__arm_vqrdmlsdhxq_s32): Likewise.
(__arm_vqrdmlsdhq_s32): Likewise.
(__arm_vqrdmlashq_n_s32): Likewise.
(__arm_vqrdmlahq_n_s32): Likewise.
(__arm_vqrdmladhxq_s32): Likewise.
(__arm_vqrdmladhq_s32): Likewise.
(__arm_vqdmlsdhxq_s32): Likewise.
(__arm_vqdmlsdhq_s32): Likewise.
(__arm_vqdmlahq_n_s32): Likewise.
(__arm_vqdmladhxq_s32): Likewise.
(__arm_vqdmladhq_s32): Likewise.
(__arm_vmlsdavaxq_s32): Likewise.
(__arm_vmlsdavaq_s32): Likewise.
(__arm_vmlasq_n_s32): Likewise.
(__arm_vmlaq_n_s32): Likewise.
(__arm_vmladavaxq_s32): Likewise.
(__arm_vmladavaq_s32): Likewise.
(__arm_vsriq_n_s32): Likewise.
(__arm_vsliq_n_s32): Likewise.
(__arm_vpselq_u64): Likewise.
(__arm_vpselq_s64): Likewise.
(vcmpneq_m_n): Define polymorphic variant.
(vcmpneq_m): Likewise.
(vqrdmlsdhq): Likewise.
(vqrdmlsdhxq): Likewise.
(vqrshlq_m_n): Likewise.
(vqshlq_m_r): Likewise.
(vrev64q_m): Likewise.
(vrshlq_m_n): Likewise.
(vshlq_m_r): Likewise.
(vsliq_n): Likewise.
(vsriq_n): Likewise.
(vqrdmlashq_n): Likewise.
(vqrdmlahq): Likewise.
(vqrdmladhxq): Likewise.
(vqrdmladhq): Likewise.
(vqnegq_m): Likewise.
(vqdmlsdhxq): Likewise.
(vabsq_m): Likewise.
(vclsq_m): Likewise.
(vclzq_m): Likewise.
(vcmpgeq_m): Likewise.
(vcmpgeq_m_n): Likewise.
(vdupq_m_n): Likewise.
(vmaxaq_m): Likewise.
(vmlaq_n): Likewise.
(vmlasq_n): Likewise.
(vmvnq_m): Likewise.
(vnegq_m): Likewise.
(vpselq): Likewise.
(vqdmlahq_n): Likewise.
(vqrdmlahq_n): Likewise.
(vqdmlsdhq): Likewise.
(vqdmladhq): Likewise.
(vqabsq_m): Likewise.
(vminaq_m): Likewise.
(vrmlaldavhaq): Likewise.
(vmlsdavxq_p): Likewise.
(vmlsdavq_p): Likewise.
(vmlsdavaxq): Likewise.
(vmlsdavaq): Likewise.
(vaddvaq_p): Likewise.
(vcmpcsq_m_n): Likewise.
(vcmpcsq_m): Likewise.
(vcmpeqq_m_n): Likewise.
(vcmpeqq_m): Likewise.
(vmladavxq_p): Likewise.
(vmladavq_p): Likewise.
(vmladavaxq): Likewise.
(vmladavaq): Likewise.
(vminvq_p): Likewise.
(vminavq_p): Likewise.
(vmaxvq_p): Likewise.
(vmaxavq_p): Likewise.
(vcmpltq_m_n): Likewise.
(vcmpltq_m): Likewise.
(vcmpleq_m): Likewise.
(vcmpleq_m_n): Likewise.
(vcmphiq_m_n): Likewise.
(vcmphiq_m): Likewise.
(vcmpgtq_m_n): Likewise.
(vcmpgtq_m): Likewise.
* config/arm/arm_mve_builtins.def (TERNOP_NONE_NONE_NONE_IMM): Use
builtin qualifier.
(TERNOP_NONE_NONE_NONE_NONE): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE): Likewise.
(TERNOP_UNONE_UNONE_UNONE_IMM): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE): Likewise.
* config/arm/constraints.md (Rc): Define constraint to check constant is
in the range of 0 to 15.
(Re): Define constraint to check constant is in the range of 0 to 31.
* config/arm/mve.md (VADDVAQ_P): Define iterator.
(VCLZQ_M): Likewise.
(VCMPEQQ_M_N): Likewise.
(VCMPEQQ_M): Likewise.
(VCMPNEQ_M_N): Likewise.
(VCMPNEQ_M): Likewise.
(VDUPQ_M_N): Likewise.
(VMAXVQ_P): Likewise.
(VMINVQ_P): Likewise.
(VMLADAVAQ): Likewise.
(VMLADAVQ_P): Likewise.
(VMLAQ_N): Likewise.
(VMLASQ_N): Likewise.
(VMVNQ_M): Likewise.
(VPSELQ): Likewise.
(VQDMLAHQ_N): Likewise.
(VQRDMLAHQ_N): Likewise.
(VQRDMLASHQ_N): Likewise.
(VQRSHLQ_M_N): Likewise.
(VQSHLQ_M_R): Likewise.
(VREV64Q_M): Likewise.
(VRSHLQ_M_N): Likewise.
(VSHLQ_M_R): Likewise.
(VSLIQ_N): Likewise.
(VSRIQ_N): Likewise.
(mve_vabsq_m_s<mode>): Define RTL pattern.
(mve_vaddvaq_p_<supf><mode>): Likewise.
(mve_vclsq_m_s<mode>): Likewise.
(mve_vclzq_m_<supf><mode>): Likewise.
(mve_vcmpcsq_m_n_u<mode>): Likewise.
(mve_vcmpcsq_m_u<mode>): Likewise.
(mve_vcmpeqq_m_n_<supf><mode>): Likewise.
(mve_vcmpeqq_m_<supf><mode>): Likewise.
(mve_vcmpgeq_m_n_s<mode>): Likewise.
(mve_vcmpgeq_m_s<mode>): Likewise.
(mve_vcmpgtq_m_n_s<mode>): Likewise.
(mve_vcmpgtq_m_s<mode>): Likewise.
(mve_vcmphiq_m_n_u<mode>): Likewise.
(mve_vcmphiq_m_u<mode>): Likewise.
(mve_vcmpleq_m_n_s<mode>): Likewise.
(mve_vcmpleq_m_s<mode>): Likewise.
(mve_vcmpltq_m_n_s<mode>): Likewise.
(mve_vcmpltq_m_s<mode>): Likewise.
(mve_vcmpneq_m_n_<supf><mode>): Likewise.
(mve_vcmpneq_m_<supf><mode>): Likewise.
(mve_vdupq_m_n_<supf><mode>): Likewise.
(mve_vmaxaq_m_s<mode>): Likewise.
(mve_vmaxavq_p_s<mode>): Likewise.
(mve_vmaxvq_p_<supf><mode>): Likewise.
(mve_vminaq_m_s<mode>): Likewise.
(mve_vminavq_p_s<mode>): Likewise.
(mve_vminvq_p_<supf><mode>): Likewise.
(mve_vmladavaq_<supf><mode>): Likewise.
(mve_vmladavq_p_<supf><mode>): Likewise.
(mve_vmladavxq_p_s<mode>): Likewise.
(mve_vmlaq_n_<supf><mode>): Likewise.
(mve_vmlasq_n_<supf><mode>): Likewise.
(mve_vmlsdavq_p_s<mode>): Likewise.
(mve_vmlsdavxq_p_s<mode>): Likewise.
(mve_vmvnq_m_<supf><mode>): Likewise.
(mve_vnegq_m_s<mode>): Likewise.
(mve_vpselq_<supf><mode>): Likewise.
(mve_vqabsq_m_s<mode>): Likewise.
(mve_vqdmlahq_n_<supf><mode>): Likewise.
(mve_vqnegq_m_s<mode>): Likewise.
(mve_vqrdmladhq_s<mode>): Likewise.
(mve_vqrdmladhxq_s<mode>): Likewise.
(mve_vqrdmlahq_n_<supf><mode>): Likewise.
(mve_vqrdmlashq_n_<supf><mode>): Likewise.
(mve_vqrdmlsdhq_s<mode>): Likewise.
(mve_vqrdmlsdhxq_s<mode>): Likewise.
(mve_vqrshlq_m_n_<supf><mode>): Likewise.
(mve_vqshlq_m_r_<supf><mode>): Likewise.
(mve_vrev64q_m_<supf><mode>): Likewise.
(mve_vrshlq_m_n_<supf><mode>): Likewise.
(mve_vshlq_m_r_<supf><mode>): Likewise.
(mve_vsliq_n_<supf><mode>): Likewise.
(mve_vsriq_n_<supf><mode>): Likewise.
(mve_vqdmlsdhxq_s<mode>): Likewise.
(mve_vqdmlsdhq_s<mode>): Likewise.
(mve_vqdmladhxq_s<mode>): Likewise.
(mve_vqdmladhq_s<mode>): Likewise.
(mve_vmlsdavaxq_s<mode>): Likewise.
(mve_vmlsdavaq_s<mode>): Likewise.
(mve_vmladavaxq_s<mode>): Likewise.
* config/arm/predicates.md (mve_imm_15): Define predicate to check the
matching constraint Rc.
(mve_imm_31): Define predicate to check the matching constraint Re.
gcc/testsuite/ChangeLog:
2020-03-18 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabsq_m_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabsq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabsq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vaddvaq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclsq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vclzq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpcsq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgeq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpgtq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmphiq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpleq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpltq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpneq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vdupq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxaq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmaxvq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminaq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vminvq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavaxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavq_p_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmladavxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlaq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlasq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavaxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_p_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_p_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmlsdavxq_p_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vmvnq_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vnegq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u64.c: Likewise.
* gcc.target/arm/mve/intrinsics/vpselq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqabsq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmladhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlahq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqdmlsdhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqnegq_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmladhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlahq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlashq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrdmlsdhxq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqshlq_m_r_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrev64q_m_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrshlq_m_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlq_m_r_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsliq_n_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vsriq_n_u8.c: Likewise.
Tobias Burnus [Wed, 18 Mar 2020 15:28:08 +0000 (16:28 +0100)]
Fix libgomp.oacc-fortran/atomic_capture-1.f90
2020-03-18 Julian Brown <julian@codesourcery.com>
Tobias Burnus <tobias@codesourcery.com>
* testsuite/libgomp.oacc-fortran/atomic_capture-1.f90: Really make
it work concurrently.
David Malcolm [Wed, 11 Mar 2020 21:06:41 +0000 (17:06 -0400)]
analyzer: make summarized dumps more comprehensive
The previous implementation of summarized dumps within
region_model::dump_to_pp showed only the "top-level" keys within the
current frame and for globals, and thus didn't e.g. show the values
of fields of structs, or elements of arrays.
This patch rewrites it to gather a vec of representative path_vars
for all regions, using this to generate the dump, so that all expressible
lvalues ought to make it to the summarized dump.
gcc/analyzer/ChangeLog:
* region-model.cc: Include "stor-layout.h".
(region_model::dump_to_pp): Rather than calling
dump_summary_of_map on each of the current frame and the globals,
instead get a vec of representative path_vars for all regions,
and then dump a summary of all of them.
(region_model::dump_summary_of_map): Delete, rewriting into...
(region_model::dump_summary_of_rep_path_vars): ...this new
function, working on a vec of path_vars.
(region_model::set_value): New overload.
(region_model::get_representative_path_var): Rename
"parent_region" local to "parent_reg" and consolidate with other
local. Guard test for grandparent being stack on parent_reg being
non-NULL. Move handling for parent being an array_region to
within guard for parent_reg being non-NULL.
(selftest::make_test_compound_type): New function.
(selftest::test_dump_2): New selftest.
(selftest::test_dump_3): New selftest.
(selftest::test_stack_frames): Update expected output from
simplified dump to show "a" and "b" from parent frame and "y" in
child frame.
(selftest::analyzer_region_model_cc_tests): Call test_dump_2 and
test_dump_3.
* region-model.h (region_model::set_value): New overload decl.
(region_model::dump_summary_of_map): Delete.
(region_model::dump_summary_of_rep_path_vars): New.
David Malcolm [Tue, 10 Mar 2020 22:50:03 +0000 (18:50 -0400)]
analyzer: add test coverage for fixed ICE [PR94047]
PR analyzer/94047 reports an ICE, which turned out to be caused
by the erroneous use of TREE_TYPE on the view region's type
in region_model::get_representative_path_var that I introduced
in
r10-7024-ge516294a1acb28aaaad44cfd583cc6a80354044e and
fixed in g:787477a226033e36be3f6d16b71be13dd917e982.
This patch adds a regression test for the ICE.
gcc/testsuite/ChangeLog:
PR analyzer/94047
* gcc.dg/analyzer/pr94047.c: New test.
David Malcolm [Tue, 17 Mar 2020 18:43:43 +0000 (14:43 -0400)]
analyzer: introduce noop_region_model_context
tentative_region_model_context and test_region_model_context are both
forced to implement numerous pure virtual vfuncs of the abstract
region_model_context.
This patch adds a noop_region_model_context which provides empty
implementations of all of region_model_context's pure virtual functions,
and subclasses the above classes from that, rather than from
region_model_context directly.
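As a rough sketch of the pattern (the names below are hypothetical, not the analyzer's actual interface), a no-op base class lets each concrete context override only the hooks it cares about:

  /* Abstract interface with several pure virtual hooks.  */
  struct context_iface
  {
    virtual ~context_iface () {}
    virtual void warn () = 0;
    virtual void on_state_change () = 0;
  };

  /* Provides empty implementations of every pure virtual hook.  */
  struct noop_context : context_iface
  {
    void warn () override {}
    void on_state_change () override {}
  };

  /* Concrete helpers derive from noop_context and override only what
     they need, instead of stubbing out the whole interface themselves.  */
  struct tentative_context : noop_context
  {
    void warn () override { m_saw_warning = true; }
    bool m_saw_warning = false;
  };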
gcc/analyzer/ChangeLog:
* region-model.h (class noop_region_model_context): New subclass
of region_model_context.
(class tentative_region_model_context): Inherit from
noop_region_model_context rather than from region_model_context;
drop redundant vfunc implementations.
(class test_region_model_context): Likewise.
David Malcolm [Tue, 17 Mar 2020 14:25:14 +0000 (10:25 -0400)]
analyzer: tweaks to exploded_node ctor
I have followup work that touches this, so it's easiest to get this
cleanup in first.
gcc/analyzer/ChangeLog:
* engine.cc (exploded_node::exploded_node): Move implementation
here from header; accept point_and_state by const reference rather
than by value.
* exploded-graph.h (exploded_node::exploded_node): Pass
point_and_state by const reference rather than by value. Move
body to engine.cc.
Jonathan Wakely [Wed, 18 Mar 2020 12:55:29 +0000 (12:55 +0000)]
libstdc++: Fix compilation of <stop_token> with Clang
Clang 9 supports C++20 via -std=c++2a but doesn't support three-way
comparisons, so <stop_token> fails to compile. When the compiler doesn't
support default comparisons, this patch defines operator== and
operator!= for the _Stop_state_ref class. That is enough for the header
to be compiled with Clang. It allows operator== for stop_token and
stop_source to work, but not operator!= because that isn't explicitly
defined.
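A minimal sketch of that fallback, assuming a guard on the defaulted-comparisons feature-test macro (the type name and guard below are illustrative, not the header's actual code):

  struct state_ref_sketch
  {
    void *ptr = nullptr;

  #if __cpp_impl_three_way_comparison >= 201907L
    /* The compiler can synthesize the comparison.  */
    friend bool operator== (const state_ref_sketch &,
                            const state_ref_sketch &) = default;
  #else
    /* Older compilers (e.g. Clang 9) need them spelled out.  */
    friend bool operator== (const state_ref_sketch &a,
                            const state_ref_sketch &b) noexcept
    { return a.ptr == b.ptr; }

    friend bool operator!= (const state_ref_sketch &a,
                            const state_ref_sketch &b) noexcept
    { return !(a == b); }
  #endif
  };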
* include/std/stop_token (stop_token::_Stop_state_ref): Define
comparison operators explicitly if the compiler won't synthesize them.
Jonathan Wakely [Wed, 18 Mar 2020 12:55:29 +0000 (12:55 +0000)]
libstdc++: Fix compilation with released versions of Clang
Clang 9 supports C++20 via -std=c++2a but doesn't support Concepts, so
several of the new additions related to the Ranges library fail to
compile with -std=c++2a. The new definition of iterator_traits and the
definition of default_sentinel_t are guarded by __cpp_lib_concepts, so
check that in addition to __cplusplus > 201703L.
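A hedged sketch of the guard pattern (illustrative only; the alias is made up):

  #include <version>   /* feature-test macros */
  #include <iterator>  /* std::iter_reference_t in C++20 */

  #if __cplusplus > 201703L && __cpp_lib_concepts
  /* Only compiled when the library actually provides concepts support,
     not merely when the -std=c++2a dialect switch is in effect.  */
  template<typename It>
    using ref_t = std::iter_reference_t<It>;
  #endif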
* include/bits/stl_algobase.h (__lexicographical_compare_aux): Check
__cpp_lib_concepts before using iter_reference_t.
* include/bits/stream_iterator.h (istream_iterator): Check
__cpp_lib_concepts before using default_sentinel_t.
* include/bits/streambuf_iterator.h (istreambuf_iterator): Likewise.
Andrew Stubbs [Tue, 17 Mar 2020 12:49:19 +0000 (12:49 +0000)]
amdgcn: Fix vector compare modes
The GCN VCC register has 64 CC values in one register, one bit for each
vector lane.
Previously we avoided problems with invalid optimizations by not declaring
a mode for the comparison operators, but it turns out that causes other
problems (and build warnings).
Instead, the optimization issues can be avoided by setting
STORE_FLAG_VALUE to -1, meaning that all the bits are significant.
(It would be better if we could set STORE_FLAG_VALUE according to the
known mask or vector size, but we can't.)
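The target macro change can be pictured as follows (the definition lives in config/gcn/gcn.h; the comment is an informal gloss):

  /* A "true" vector compare produces an all-ones lane mask in the 64-bit
     condition register, so every bit of the result is significant.  */
  #define STORE_FLAG_VALUE -1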
2020-03-18 Andrew Stubbs <ams@codesourcery.com>
gcc/
* config/gcn/gcn-valu.md (vec_cmp<mode>di): Set operand 1 to DImode.
(vec_cmp<mode>di_dup): Likewise.
* config/gcn/gcn.h (STORE_FLAG_VALUE): Set to -1.
Andrew Stubbs [Tue, 3 Mar 2020 17:36:49 +0000 (17:36 +0000)]
amdgcn: Add cond_add/sub/and/ior/xor for all vector modes
2020-03-18 Andrew Stubbs <ams@codesourcery.com>
gcc/
* config/gcn/gcn-valu.md (COND_MODE): Delete.
(COND_INT_MODE): Delete.
(cond_op): Add "mult".
(cond_<expander><mode>): Use VEC_ALLREG_MODE.
(cond_<expander><mode>): Use VEC_ALLREG_INT_MODE.
Nathan Sidwell [Wed, 18 Mar 2020 12:16:28 +0000 (05:16 -0700)]
PR c++/94147 - mangling of lambdas assigned to globals
This patch implements Jason's suggestion of pushing a lambda scope
when parsing a global variable initializer. That bit worked fine, but
happened to cause g++.dg/opt/dump1.C to not give any
used-but-not-defined warnings.
The reason was no_linkage_check, which considers any lambda that has
an extra-scope to have linkage. Which is technically correct. Except
that we think that all types that have linkage have external linkage.
Our representation of linkage and visibility is somewhat inaccurate,
particularly when it comes to types. We have TREE_PUBLIC,
DECL_EXTERNAL, DECL_VISIBILITY, DECL_COMDAT, DECL_NOT_REALLY_EXTERN.
It could really do with a thorough cleanup, but that won't be a simple
task.
The best I could come up with was seeing if the extra scope was a
VAR_DECL, and if that was TREE_PUBLIC and the var was inline (its
COMDATness is sadly not set at that point) or a template
instantiation, then the lambda had linkage. Otherwise it's as if it
has no linkage from the POV of the compiler internals.
This is an ABI change (so we should document it), but it's changing
mangling from an unpredictable (in practice) counter, to something the
ABI defines. So I'm not concerned about mangling-changed warnings, or
preserving the broken mangling under some ABI selection flag. Code
that did this worked by accident within a single TU. It'll continue
to work by design there, and across TUs.
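A hypothetical illustration (the variable name is mine, not from the
patch) of the kind of code whose mangling is now ABI-defined:

    // C++17 inline variable at namespace scope, initialized with a lambda.
    // The closure type's mangling is now derived from 'handler', so every
    // translation unit that includes this agrees on the symbol names.
    inline auto handler = [](int x) { return x + 1; };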
* parser.c (cp_parser_init_declarator): Namespace-scope variables
provide a lambda scope.
* tree.c (no_linkage_check): Lambdas with a variable for extra
scope have a linkage from the variable.
Richard Biener [Wed, 18 Mar 2020 12:11:30 +0000 (13:11 +0100)]
middle-end/94206 fix memset folding to avoid types with padding
This makes sure that the store that a memset is folded to uses a type
covering all of the object's bits.
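A hypothetical illustration, not the committed testcase, of why the
store type matters:

    #include <cstring>

    struct padded { char c; int i; };   // 3 padding bytes on typical ABIs

    void clear (padded *p)
    {
      // Folding this into a single store must use a type covering every
      // byte of the object, including the padding between 'c' and 'i';
      // a store type whose precision skips those bits would leave the
      // padding bytes unwritten.
      std::memset (p, 0, sizeof *p);
    }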
2020-03-18 Richard Biener <rguenther@suse.de>
PR middle-end/94206
* gimple-fold.c (gimple_fold_builtin_memset): Avoid using
partial int modes or not mode-precision integer types for
the store.
* gcc.dg/torture/pr94206.c: New testcase.
Jakub Jelinek [Wed, 18 Mar 2020 11:56:26 +0000 (12:56 +0100)]
Fix up duplicated duplicated words in comments
Another set of duplicated word fixes for things I missed last time.
These include e.g. *.cc files I forgot about, and duplicated words at the
start or end of a line.
2020-03-18 Jakub Jelinek <jakub@redhat.com>
* asan.c (get_mem_refs_of_builtin_call): Fix up duplicated word issue
in a comment.
* config/arc/arc.c (frame_stack_add): Likewise.
* gimple-loop-versioning.cc (loop_versioning::analyze_arbitrary_term):
Likewise.
* ipa-predicate.c (predicate::remap_after_inlining): Likewise.
* tree-ssa-strlen.h (handle_printf_call): Likewise.
* tree-ssa-strlen.c (is_strlen_related_p): Likewise.
* optinfo-emit-json.cc (optrecord_json_writer::add_record): Likewise.
analyzer/
* sm-malloc.cc (malloc_state_machine::on_stmt): Fix up duplicated word
issue in a comment.
* region-model.cc (region_model::make_region_for_unexpected_tree_code,
region_model::delete_region_and_descendents): Likewise.
* engine.cc (class exploded_cluster): Likewise.
* diagnostic-manager.cc (class path_builder): Likewise.
cp/
* constraint.cc (resolve_function_concept_check, subsumes_constraints,
strictly_subsumes): Fix up duplicated word issue in a comment.
* coroutines.cc (build_init_or_final_await, captures_temporary):
Likewise.
* logic.cc (dnf_size_r, cnf_size_r): Likewise.
* pt.c (append_type_to_template_for_access_check): Likewise.
d/
* expr.cc (ExprVisitor::visit (CatAssignExp *)): Fix up duplicated
word issue in a comment.
* d-target.cc (Target::FPTypeProperties<T>::max): Likewise.
fortran/
* class.c (generate_finalization_wrapper): Fix up duplicated word
issue in a comment.
* trans-types.c (gfc_get_nodesc_array_type): Likewise.
Duan bo [Wed, 18 Mar 2020 10:18:39 +0000 (10:18 +0000)]
aarch64: Fix SYMBOL_TINY_GOT handling for ILP32 [PR94201]
The SYMBOL_TINY_GOT case in aarch64_load_symref_appropriately was
missing support for ILP32. This caused an ICE on the testcase.
2020-03-18 Duan bo <duanbo3@huawei.com>
gcc/
PR target/94201
* config/aarch64/aarch64.md (ldr_got_tiny): Delete.
(@ldr_got_tiny_<mode>): New pattern.
(ldr_got_tiny_sidi): Likewise.
* config/aarch64/aarch64.c (aarch64_load_symref_appropriately): Use
them to handle SYMBOL_TINY_GOT for ILP32.
gcc/testsuite/
PR target/94201
* gcc.target/aarch64/pr94201.c: New test.
Richard Sandiford [Mon, 16 Mar 2020 15:26:35 +0000 (15:26 +0000)]
aarch64: Treat p12-p15 as call-preserved in SVE PCS functions
Due to a stupid mistake that I can't really explain, I'd got the
treatment of p12-p15 mixed up when adding support for the SVE PCS.
The registers are supposed to be call-preserved rather than
call-clobbered.
The fix is simple, but it has quite a big effect on the PCS tests
(as it should!).
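For illustration only (not one of the testcases, and the intrinsic
choice is arbitrary): a function with SVE types in its signature uses
the SVE PCS, so it must now save and restore p12-p15 itself if it
clobbers them.

    #include <arm_sve.h>   // build with an SVE-enabled -march

    // SVE PCS: svbool_t/svint32_t arguments select the SVE calling
    // convention, in which p12-p15 are call-preserved.
    svbool_t sve_pcs_callee (svbool_t pg, svint32_t a, svint32_t b)
    {
      return svcmpeq_s32 (pg, a, b);
    }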
2020-03-18 Richard Sandiford <richard.sandiford@arm.com>
gcc/
* config/aarch64/aarch64.c (aarch64_sve_abi): Treat p12-p15 as
call-preserved for SVE PCS functions.
(aarch64_layout_frame): Cope with up to 12 predicate save slots.
Optimize the case in which there are no following vector save slots.
gcc/testsuite/
* gcc.target/aarch64/sve/acle/general/cpy_1.c: Leave gaps in the
check-function-bodies patterns for p15 to be saved.
* gcc.target/aarch64/sve/pcs/args_1.c (callee_pred): Expect two
predicates to be saved.
* gcc.target/aarch64/sve/pcs/saves_1_be_nowrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_1_be_wrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_1_le_nowrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_1_le_wrap.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/saves_2_be_nowrap.c: Expect p12-p15
to be saved and restored.
* gcc.target/aarch64/sve/pcs/saves_2_be_wrap.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_2_le_nowrap.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_2_le_wrap.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_4_be.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_4_le.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_5_be.c: Likewise.
* gcc.target/aarch64/sve/pcs/saves_5_le.c: Likewise.
* gcc.target/aarch64/sve/pcs/stack_clash_1.c (test_1): Likewise.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/stack_clash_1_128.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
* gcc.target/aarch64/sve/pcs/stack_clash_1_256.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 16 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_1_512.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 16 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_1_1024.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 16 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_1_2048.c (test_1): Expect
p12-p15 to be saved and restored.
(test_2): Remove p12-p15 from the clobber list.
(test_4): Expect only 32 bytes of stack to be allocated for the
predicate save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_2_256.c: Use z16 rather
than p4 to create a vector-sized save slot.
* gcc.target/aarch64/sve/pcs/stack_clash_2_512.c: Likewise.
* gcc.target/aarch64/sve/pcs/stack_clash_2_1024.c: Likewise.
* gcc.target/aarch64/sve/pcs/stack_clash_2_2048.c: Likewise.
Tobias Burnus [Wed, 18 Mar 2020 11:07:54 +0000 (12:07 +0100)]
libgomp testsuite - disable long double for AMDGCN
* testsuite/libgomp.oacc-c++/firstprivate-mappings-1.C: Add
#define DO_LONG_DOUBLE; set to 1, except for nvidia + gcn.
* libgomp.oacc-c-c++-common/firstprivate-mappings-1.c: Likewise.
* g++.dg/goacc/firstprivate-mappings-1.C: Only set DO_LONG_DOUBLE if
not defined; update comments.
* c-c++-common/goacc/firstprivate-mappings-1.c: Likewise.
Richard Biener [Wed, 18 Mar 2020 08:13:17 +0000 (09:13 +0100)]
middle-end/94188 fix fold of addr expression generation
This adds a missing type conversion to build_fold_addr_expr and adjusts
fallout - build_fold_addr_expr was used as a convenience to build an
ADDR_EXPR but some callers do not expect the result to be simplified
to something else.
2020-03-18 Richard Biener <rguenther@suse.de>
PR middle-end/94188
* fold-const.c (build_fold_addr_expr): Convert address to
correct type.
* asan.c (maybe_create_ssa_name): Strip useless type conversions.
* gimple-fold.c (gimple_fold_stmt_to_constant_1): Use build1
to build the ADDR_EXPR which we don't really want to simplify.
* tree-ssa-dom.c (record_equivalences_from_stmt): Likewise.
* tree-ssa-loop-im.c (gather_mem_refs_stmt): Likewise.
* tree-ssa-forwprop.c (forward_propagate_addr_expr_1): Likewise.
(simplify_builtin_call): Strip useless type conversions.
* tree-ssa-strlen.c (new_strinfo): Likewise.
* gcc.dg/pr94188.c: New testcase.
Jakub Jelinek [Wed, 18 Mar 2020 07:53:23 +0000 (08:53 +0100)]
c++: Diagnose a deduction guide in a wrong scope [PR91759]
The following testcase is accepts-invalid since
r7-6608-ga56c0ac08242269b.
Before that change we had this
"deduction guide %qD must be declared in the same scope as %qT"
diagnostic for it; after the change it is expected to be diagnosed
in set_decl_namespace at the not_found: label in there. On this testcase
nothing is diagnosed though, because set_decl_namespace isn't called at all,
as in_namespace is NULL.
The following patch restores the old warning but does it only in case we
don't call set_decl_namespace.
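A hypothetical sketch, not the new testcase, of the shape of code the
restored diagnostic catches; no nested-name-specifier is used, so
in_namespace is NULL_TREE:

    template<typename T> struct S { S(T); };

    namespace M
    {
      // error: deduction guide must be declared in the same scope as S
      S(int) -> S<int>;
    }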
2020-03-18 Jakub Jelinek <jakub@redhat.com>
PR c++/91759
* decl.c (grokfndecl): Restore old diagnostics about deduction
guide declared in different scope if in_namespace is NULL_TREE.
* g++.dg/cpp1z/class-deduction72.C: New test.
Alexey Neyman [Sun, 15 Mar 2020 00:05:36 +0000 (17:05 -0700)]
dwarf: Generate DIEs for external variables with -g1 [93751]
-g1 is described in the manual to generate debug info for functions and
external variables. It does that for older debugging formats but not for
DWARF. This change brings DWARF in line with the rest of the debugging
formats and with the manual.
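A hypothetical example of what the change affects when compiling with
-g1:

    // 'counter' is an external (namespace-scope) variable, so it now gets
    // a DWARF DIE just like the function does; its type is still not
    // described at this debug level.
    int counter;

    int next (void) { return ++counter; }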
gcc/ChangeLog
2020-03-17 Alexey Neyman <stilor@att.net>
PR debug/93751
* dwarf2out.c (gen_decl_die): Proceed to generating the DIE if
the debug level is terse and the declaration is public. Do not
generate type info.
(dwarf2out_decl): Same.
(add_type_attribute): Return immediately if debug level is
terse.
Signed-off-by: Alexey Neyman <stilor@att.net>
Jason Merrill [Tue, 17 Mar 2020 09:45:02 +0000 (05:45 -0400)]
c++: Fix comment typo.
Jonathan Wakely [Wed, 18 Mar 2020 00:23:39 +0000 (00:23 +0000)]
libstdc++: Fix type-erasure in experimental::net::executor (PR 94203)
The _Tgt and _TgtImpl types that implement type-erasure didn't agree on
the virtual interface, so failed as soon as they were instantiated. With
Clang they failed even sooner. The interface was also dependent on
whether RTTI was enabled or not.
This patch fixes the broken virtual functions and makes the type work
without RTTI, by using a pointer to a specialization of a function
template (similar to the approaches in std::function and std::any).
The changes to the virtual functions would be an ABI change, except that
the previous code didn't even compile if instantiated. This is
experimental TS material anyway.
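A minimal, self-contained sketch of that technique (simplified, with
illustrative names rather than the header's):

    #include <memory>
    #include <utility>

    class erased
    {
      struct base { virtual ~base() = default; };

      template<typename T>
        struct impl : base
        {
          explicit impl(T t) : value(std::move(t)) { }

          // One specialization of this function per stored type T; its
          // address serves as the type tag, so no std::type_info is needed.
          static const void* s_func(const base* self)
          { return std::addressof(static_cast<const impl*>(self)->value); }

          T value;
        };

      std::shared_ptr<base> target;
      const void* (*func)(const base*) = nullptr;   // plays the role of _M_func

    public:
      template<typename T>
        erased(T t)
        : target(std::make_shared<impl<T>>(std::move(t))),
          func(&impl<T>::s_func) { }

      template<typename T>
        const T* get() const
        {
          if (func != &impl<T>::s_func)
            return nullptr;                 // stored type is not T
          return static_cast<const T*>(func(target.get()));
        }
    };

With this, erased e(42); e.get<int>() yields a pointer to the stored
int, while e.get<long>() yields null, all without typeid.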
PR libstdc++/94203
* include/experimental/executor (executor::executor(Executor)): Call
make_shared directly instead of _M_create. Create _Tgt1 object.
(executor::executor(allocator_arg_t, const ProtoAlloc&, Executor)):
Call allocate_shared directly instead of _M_create. Create _Tgt2
object.
(executor::target_type): Add cast needed for new _Tgt interface.
(executor::target): Define when RTTI is disabled. Use _Tgt::_M_func.
(executor::_Tgt): Define the same interface whether RTTI is enabled or
not.
(executor::_Tgt::target_type, executor::_Tgt::target): Do not use
std::type_info in the interface.
(executor::_Tgt::_M_func): Add data member.
(executor::_TgtImpl): Replace with _Tgt1 and _Tgt2 class templates.
(executor::_Tgt1::_S_func): Define function to access target without
depending on RTTI.
(executor::_M_create): Remove.
(operator==, operator!=): Simplify comparisons for executor.
* include/experimental/socket (is_error_code_enum<socket_errc>):
Define specialization before use.
* testsuite/experimental/net/executor/1.cc: New test.
GCC Administrator [Wed, 18 Mar 2020 00:16:17 +0000 (00:16 +0000)]
Daily bump.
Uros Bizjak [Tue, 17 Mar 2020 22:01:04 +0000 (23:01 +0100)]
testsuite: Fix g++.dg/debug/dwarf2/const2b.C target selector
* g++.dg/debug/dwarf2/const2b.C (dg-do): Fix target selector.
Jakub Jelinek [Tue, 17 Mar 2020 21:32:34 +0000 (22:32 +0100)]
c: Handle C_TYPE_INCOMPLETE_VARS even for ENUMERAL_TYPEs [PR94172]
The following testcases ICE, because they contain extern variable
declarations with incomplete enum types that are later completed, and after
that those variables are accessed. The ICEs are because the vars then may have
incorrect DECL_MODE etc., e.g. in the first case the var has SImode
DECL_MODE (the guessed mode for the enum), but the enum then actually has
DImode because its enumerators don't fit into unsigned int.
The following patch fixes it by using C_TYPE_INCOMPLETE_VARS not just on
incomplete struct/union types, but also incomplete enum types.
TYPE_VFIELD can't be used as it is TYPE_MIN_VALUE on ENUMERAL_TYPE,
thankfully TYPE_LANG_SLOT_1 has been used in the C FE only on
FUNCTION_TYPEs.
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR c/94172
* c-tree.h (C_TYPE_INCOMPLETE_VARS): Define to TYPE_LANG_SLOT_1
instead of TYPE_VFIELD, and support it on {RECORD,UNION,ENUMERAL}_TYPE.
(TYPE_ACTUAL_ARG_TYPES): Check that it is only used on FUNCTION_TYPEs.
* c-decl.c (pushdecl): Push C_TYPE_INCOMPLETE_VARS also to
ENUMERAL_TYPEs.
(finish_incomplete_vars): New function, moved from finish_struct. Use
relayout_decl instead of layout_decl.
(finish_struct): Remove obsolete comment about C_TYPE_INCOMPLETE_VARS
being TYPE_VFIELD. Use finish_incomplete_vars.
(finish_enum): Clear C_TYPE_INCOMPLETE_VARS. Call
finish_incomplete_vars.
* c-typeck.c (c_build_qualified_type): Clear C_TYPE_INCOMPLETE_VARS
also on ENUMERAL_TYPEs.
* gcc.dg/pr94172-1.c: New test.
* gcc.dg/pr94172-2.c: New test.
Jakub Jelinek [Tue, 17 Mar 2020 20:21:16 +0000 (21:21 +0100)]
c++: Fix parsing of invalid enum specifiers [PR90995]
The testcase shows some accepts-invalid (the ones without alignas) and
ice-on-invalid-code (the ones with alignas) cases.
If the enum doesn't have an underlying type and is not a definition,
the caller retries parsing it as an elaborated type specifier.
E.g. for enum struct S s it will then pedwarn that elaborated type specifier
shouldn't have the struct/class keywords.
The problem arises if the enum specifier is not followed by { when it has
an underlying type. In that case we have already called
cp_parser_parse_definitely to end the tentative parsing started at the
beginning of cp_parser_enum_specifier. But the
cp_parser_error (parser, "expected %<;%> or %<{%>");
doesn't emit any error because the whole function is called from yet another
tentative parse and the caller starts parsing the elaborated type
specifier where the cp_parser_enum_specifier stopped (i.e. after the
underlying type token(s)). The ultimate caller then commits the tentative
parsing (and even if it wouldn't, it wouldn't know what kind of error
to report). I think after seeing enum {,struct,class} : type not being
followed by { or ;, there is no reason not to report it right away, as it
can't be valid C++, which is what the patch does. Not sure if we shouldn't
also return error_mark_node instead of NULL_TREE, so that the caller doesn't
try to parse it as elaborated type specifier (the patch doesn't do that
right now).
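A hypothetical sketch, not the new testcase, of the shape of code now
rejected at this point:

    // 'enum struct S : int' announces an underlying type, so the next
    // token must be '{' or ';'; seeing 't' instead is now reported
    // immediately rather than being re-parsed as an
    // elaborated-type-specifier.
    enum struct S : int t;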
Furthermore, while reading the code, I've noticed that
parser->colon_corrects_to_scope_p is saved and set to false at the start
of the function, but not restored back in some cases. Don't have a testcase
where this would be a problem, but it just seems wrong. Either we can in
the two spots replace return NULL_TREE; with { type = NULL_TREE; goto out; }
or we could perhaps abuse warning_sentinel or create a special class with
dtor to clean the flag up.
And lastly, I've fixed some formatting issues in the function while reading
it.
2020-03-17 Jakub Jelinek <jakub@redhat.com>
PR c++/90995
* parser.c (cp_parser_enum_specifier): Use temp_override for
parser->colon_corrects_to_scope_p, replace goto out with return.
If scoped enum or enum with underlying type is not followed by
{ or ;, call cp_parser_commit_to_tentative_parse before calling
cp_parser_error and make sure to return error_mark_node instead of
NULL_TREE. Formatting fixes.
* g++.dg/cpp0x/enum40.C: New test.
Richard Sandiford [Tue, 17 Mar 2020 15:39:42 +0000 (15:39 +0000)]
testsuite: Fix gcc.target/aarch64/advsimd-intrinsics/bfcvt-nosimd.c
2020-03-17 Richard Sandiford <richard.sandiford@arm.com>
gcc/testsuite/
* gcc.target/aarch64/advsimd-intrinsics/bfcvt-nosimd.c: Skip for
-fno-fat-lto-objects. Use tabs rather than spaces in the
check-function-bodies code.
Richard Sandiford [Tue, 17 Mar 2020 15:36:37 +0000 (15:36 +0000)]
aarch64: Fix bf16_v(ld|st)n.c failures for big-endian
gcc.target/aarch64/advsimd-intrinsics/bf16_vldn.c and
gcc.target/aarch64/advsimd-intrinsics/bf16_vstn.c were
failing for big-endian targets because the <Vmtype> in
aarch64_be_ld1<mode> and aarch64_be_st1<mode> had no
expansion for the bfloat16 modes.
2020-03-17 Richard Sandiford <richard.sandiford@arm.com>
gcc/
* config/aarch64/iterators.md (Vmtype): Handle V4BF and V8BF.
Ville Voutilainen [Tue, 17 Mar 2020 16:43:21 +0000 (18:43 +0200)]
Fix the ChangeLog after the __is_assignable/__is_constructible fix
Iain Sandoe [Tue, 17 Mar 2020 14:12:54 +0000 (14:12 +0000)]
coroutines, testsuite: Fix single test execution.
Invocations of coro-torture.exp like 'coro-torture.exp=some-test.C' were
failing because DEFAULT_CXXFLAGS was undefined. Fixed by defining this
locally, if it has no pre-existing global value.
Srinath Parvathaneni [Tue, 17 Mar 2020 15:56:35 +0000 (15:56 +0000)]
[ARM][GCC][1/3x]: MVE intrinsics with ternary operands.
This patch supports following MVE ACLE intrinsics with ternary operands.
vabavq_s8, vabavq_s16, vabavq_s32, vbicq_m_n_s16, vbicq_m_n_s32,
vbicq_m_n_u16, vbicq_m_n_u32, vcmpeqq_m_f16, vcmpeqq_m_f32,
vcvtaq_m_s16_f16, vcvtaq_m_u16_f16, vcvtaq_m_s32_f32, vcvtaq_m_u32_f32,
vcvtq_m_f16_s16, vcvtq_m_f16_u16, vcvtq_m_f32_s32, vcvtq_m_f32_u32,
vqrshrnbq_n_s16, vqrshrnbq_n_u16, vqrshrnbq_n_s32, vqrshrnbq_n_u32,
vqrshrunbq_n_s16, vqrshrunbq_n_s32, vrmlaldavhaq_s32, vrmlaldavhaq_u32,
vshlcq_s8, vshlcq_u8, vshlcq_s16, vshlcq_u16, vshlcq_s32, vshlcq_u32,
vabavq_u8, vabavq_u16, vabavq_u32.
Please refer to M-profile Vector Extension (MVE) intrinsics [1] for more details.
[1] https://developer.arm.com/architectures/instruction-sets/simd-isas/helium/mve-intrinsics
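For orientation, a minimal usage sketch of one of the new intrinsics;
the prototype follows the ACLE document linked above, so treat the
details as illustrative rather than authoritative:

    #include <stdint.h>
    #include <arm_mve.h>   // builds only for an MVE-enabled target

    // vabavq accumulates the absolute differences of the lanes of b and c
    // into the 32-bit scalar accumulator a.
    uint32_t
    sum_abs_diffs (uint32_t a, int8x16_t b, int8x16_t c)
    {
      return vabavq_s8 (a, b, c);
    }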
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* config/arm/arm-builtins.c (TERNOP_UNONE_UNONE_UNONE_IMM_QUALIFIERS):
Define qualifier for ternary operands.
(TERNOP_UNONE_UNONE_NONE_NONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_NONE_QUALIFIERS): Likewise.
* config/arm/arm_mve.h (vabavq_s8): Define macro.
(vabavq_s16): Likewise.
(vabavq_s32): Likewise.
(vbicq_m_n_s16): Likewise.
(vbicq_m_n_s32): Likewise.
(vbicq_m_n_u16): Likewise.
(vbicq_m_n_u32): Likewise.
(vcmpeqq_m_f16): Likewise.
(vcmpeqq_m_f32): Likewise.
(vcvtaq_m_s16_f16): Likewise.
(vcvtaq_m_u16_f16): Likewise.
(vcvtaq_m_s32_f32): Likewise.
(vcvtaq_m_u32_f32): Likewise.
(vcvtq_m_f16_s16): Likewise.
(vcvtq_m_f16_u16): Likewise.
(vcvtq_m_f32_s32): Likewise.
(vcvtq_m_f32_u32): Likewise.
(vqrshrnbq_n_s16): Likewise.
(vqrshrnbq_n_u16): Likewise.
(vqrshrnbq_n_s32): Likewise.
(vqrshrnbq_n_u32): Likewise.
(vqrshrunbq_n_s16): Likewise.
(vqrshrunbq_n_s32): Likewise.
(vrmlaldavhaq_s32): Likewise.
(vrmlaldavhaq_u32): Likewise.
(vshlcq_s8): Likewise.
(vshlcq_u8): Likewise.
(vshlcq_s16): Likewise.
(vshlcq_u16): Likewise.
(vshlcq_s32): Likewise.
(vshlcq_u32): Likewise.
(vabavq_u8): Likewise.
(vabavq_u16): Likewise.
(vabavq_u32): Likewise.
(__arm_vabavq_s8): Define intrinsic.
(__arm_vabavq_s16): Likewise.
(__arm_vabavq_s32): Likewise.
(__arm_vabavq_u8): Likewise.
(__arm_vabavq_u16): Likewise.
(__arm_vabavq_u32): Likewise.
(__arm_vbicq_m_n_s16): Likewise.
(__arm_vbicq_m_n_s32): Likewise.
(__arm_vbicq_m_n_u16): Likewise.
(__arm_vbicq_m_n_u32): Likewise.
(__arm_vqrshrnbq_n_s16): Likewise.
(__arm_vqrshrnbq_n_u16): Likewise.
(__arm_vqrshrnbq_n_s32): Likewise.
(__arm_vqrshrnbq_n_u32): Likewise.
(__arm_vqrshrunbq_n_s16): Likewise.
(__arm_vqrshrunbq_n_s32): Likewise.
(__arm_vrmlaldavhaq_s32): Likewise.
(__arm_vrmlaldavhaq_u32): Likewise.
(__arm_vshlcq_s8): Likewise.
(__arm_vshlcq_u8): Likewise.
(__arm_vshlcq_s16): Likewise.
(__arm_vshlcq_u16): Likewise.
(__arm_vshlcq_s32): Likewise.
(__arm_vshlcq_u32): Likewise.
(__arm_vcmpeqq_m_f16): Likewise.
(__arm_vcmpeqq_m_f32): Likewise.
(__arm_vcvtaq_m_s16_f16): Likewise.
(__arm_vcvtaq_m_u16_f16): Likewise.
(__arm_vcvtaq_m_s32_f32): Likewise.
(__arm_vcvtaq_m_u32_f32): Likewise.
(__arm_vcvtq_m_f16_s16): Likewise.
(__arm_vcvtq_m_f16_u16): Likewise.
(__arm_vcvtq_m_f32_s32): Likewise.
(__arm_vcvtq_m_f32_u32): Likewise.
(vcvtaq_m): Define polymorphic variant.
(vcvtq_m): Likewise.
(vabavq): Likewise.
(vshlcq): Likewise.
(vbicq_m_n): Likewise.
(vqrshrnbq_n): Likewise.
(vqrshrunbq_n): Likewise.
* config/arm/arm_mve_builtins.def
(TERNOP_UNONE_UNONE_UNONE_IMM_QUALIFIERS): Use the builtin qualifier.
(TERNOP_UNONE_UNONE_NONE_NONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_IMM_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_IMM_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_UNONE_UNONE_UNONE_UNONE_QUALIFIERS): Likewise.
(TERNOP_NONE_NONE_NONE_NONE_QUALIFIERS): Likewise.
* config/arm/mve.md (VBICQ_M_N): Define iterator.
(VCVTAQ_M): Likewise.
(VCVTQ_M_TO_F): Likewise.
(VQRSHRNBQ_N): Likewise.
(VABAVQ): Likewise.
(VSHLCQ): Likewise.
(VRMLALDAVHAQ): Likewise.
(mve_vbicq_m_n_<supf><mode>): Define RTL pattern.
(mve_vcmpeqq_m_f<mode>): Likewise.
(mve_vcvtaq_m_<supf><mode>): Likewise.
(mve_vcvtq_m_to_f_<supf><mode>): Likewise.
(mve_vqrshrnbq_n_<supf><mode>): Likewise.
(mve_vqrshrunbq_n_s<mode>): Likewise.
(mve_vrmlaldavhaq_<supf>v4si): Likewise.
(mve_vabavq_<supf><mode>): Likewise.
(mve_vshlcq_<supf><mode>): Likewise.
(mve_vshlcq_<supf><mode>): Likewise.
(mve_vshlcq_vec_<supf><mode>): Define RTL expand.
(mve_vshlcq_carry_<supf><mode>): Likewise.
gcc/testsuite/ChangeLog:
2020-03-17 Andre Vieira <andre.simoesdiasvieira@arm.com>
Mihail Ionescu <mihail.ionescu@arm.com>
Srinath Parvathaneni <srinath.parvathaneni@arm.com>
* gcc.target/arm/mve/intrinsics/vabavq_s16.c: New test.
* gcc.target/arm/mve/intrinsics/vabavq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vabavq_u8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vbicq_m_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcmpeqq_m_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_s16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_s32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_u16_f16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtaq_m_u32_f32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f16_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f16_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f32_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vcvtq_m_f32_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrnbq_n_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_n_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vqrshrunbq_n_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vrmlaldavhaq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_s16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_s32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_s8.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_u16.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_u32.c: Likewise.
* gcc.target/arm/mve/intrinsics/vshlcq_u8.c: Likewise.