* cfgrtl.c (fixup_reorder_chain): Do not emit barriers to BB_FOOTER.
* postreload-gcse.c (bb_has_well_behaved_predecessors): Correct test
for table jump at the end of a basic block using tablejump_p.
* targhooks.c (default_invalid_within_doloop): Likewise.
* config/rs6000/rs6000.c (TARGET_INVALID_WITHIN_DOLOOP): Remove
target hook implementation that is identical to the default hook.
(rs6000_invalid_within_doloop): Remove.
* bb-reorder.c (fix_crossing_unconditional_branches): Remove set but
unused variable from tablejump_p call.
* rtl.def (JUMP_TABLE_DATA): New RTX_INSN object.
* rtl.h (RTX_PREV, RTX_NEXT): Adjust for new JUMP_TABLE_DATA.
(INSN_DELETED_P): Likewise.
(emit_jump_table_data): New prototype.
* gengtype.c (adjust_field_rtx_def): Handle JUMP_TABLE_DATA fields
after 4th as unused.
* print-rtl.c (print_rtl): Handle JUMP_TABLE_DATA.
* sched-vis.c (print_insn): Likewise.
* emit-rtl.c (active_insn_p): Consider JUMP_TABLE_DATA an active
insn for compatibility with back ends that use next_active_insn to
identify jump table data.
(set_insn_deleted): Remove no longer useful JUMP_TABLE_DATA_P check.
(remove_insn): Likewise.
(emit_insn): Do not accept JUMP_TABLE_DATA objects in insn chains
to be emitted.
(emit_debug_insn, emit_jump_insn, emit_call_insn, emit_label): Idem.
(emit_jump_table_data): New function.
* cfgbuild.c (inside_basic_block_p): A JUMP_INSN is always inside a
basic block, a JUMP_TABLE_DATA never is.
(control_flow_insn_p): JUMP_TABLE_DATA is not a control flow insn.
* cfgrtl.c (duplicate_insn_chain): Split handling of JUMP_TABLE_DATA
off from code handling real insns.
* final.c (get_attr_length_1): Simplify for JUMP_INSNs.
* function.c (instantiate_virtual_regs): Remove JUMP_TABLE_DATA_P
test, now redundant because JUMP_TABLE_DATA is not an INSN_P insn.
* gcse.c (insert_insn_end_basic_block): Likewise, JUMP_TABLE_DATA_P
is not a NONDEBUG_INSN_P.
* ira-costs.c (scan_one_insn): Likewise.
* jump.c (mark_all_labels): Likewise.
(mark_jump_label_1): Likewise.
* lra-eliminations.c (eliminate_regs_in_insn): Likewise.
* lra.c (get_insn_freq): Expect all insns reaching here to be in
a basic block.
(check_rtl): Remove JUMP_TABLE_DATA_P test, not a NONDEBUG_INSN_P insn.
* predict.c (expensive_function_p): Use FOR_BB_INSNS.
* reload1.c (calculate_needs_all_insns): Call set_label_offsets for
JUMP_TABLE_DATA_P insns.
(calculate_elim_costs_all_insns): Likewise.
(set_label_offsets): Recurse on the PATTERN of JUMP_TABLE_DATA insns.
(elimination_costs_in_insn): Remove redundant JUMP_TABLE_DATA_P test.
(delete_output_reload): Code style fixups.
* reorg.c (dbr_schedule): Move JUMP_TABLE_DATA_P up to avoid setting
insn flags on this non-insn.
* sched-rgn.c (add_branch_dependences): Treat JUMP_TABLE_DATA insns
as scheduling barriers, for pre-change compatibility.
* stmt.c (emit_case_dispatch_table): Emit jump table data not as
JUMP_INSN objects but instead as JUMP_TABLE_DATA objects.
* config/alpha/alpha.c (alpha_does_function_need_gp): Remove
redundant JUMP_TABLE_DATA_P test.
* config/arm/arm.c (thumb_far_jump_used_p): Likewise.
* config/frv/frv.c (frv_function_contains_far_jump): Likewise.
(frv_for_each_packet): Likewise.
* config/i386/i386.c (min_insn_size): Likewise.
(ix86_avoid_jump_mispredicts): Likewise.
* config/m32r/m32r.c (m32r_is_insn): Likewise.
* config/mep/mep.c (mep_reorg_erepeat): Likewise.
* config/mips/mips.c (USEFUL_INSN_P): Likewise.
(mips16_insn_length): Robustify.
(mips_has_long_branch_p): Remove redundant JUMP_TABLE_DATA_P test.
(mips16_split_long_branches): Likewise.
* config/pa/pa.c (pa_combine_instructions): Likewise.
* config/rs6000/rs6000.c (get_next_active_insn): Treat
JUMP_TABLE_DATA objects as active insns, like in active_insn_p.
* config/s390/s390.c (s390_chunkify_start): Treat JUMP_TABLE_DATA
as contributing to pool range lengths.
* config/sh/sh.c (find_barrier): Restore check for ADDR_DIFF_VEC.
Remove redundant JUMP_TABLE_DATA_P test.
(sh_loop_align): Likewise.
(split_branches): Likewise.
(sh_insn_length_adjustment): Likewise.
* config/spu/spu.c (get_branch_target): Likewise.
From-SVN: r197266
+2013-03-30 Steven Bosscher <steven@gcc.gnu.org>
+
+ * cfgrtl.c (fixup_reorder_chain): Do not emit barriers to BB_FOOTER.
+
+ * postreload-gcse.c (bb_has_well_behaved_predecessors): Correct test
+ for table jump at the end of a basic block using tablejump_p.
+ * targhooks.c (default_invalid_within_doloop): Likewise.
+ * config/rs6000/rs6000.c (TARGET_INVALID_WITHIN_DOLOOP): Remove
+ target hook implementation that is identical to the default hook.
+ (rs6000_invalid_within_doloop): Remove.
+
+ * bb-reorder.c (fix_crossing_unconditional_branches): Remove set but
+ unused variable from tablejump_p call.
+
+ * rtl.def (JUMP_TABLE_DATA): New RTX_INSN object.
+ * rtl.h (RTX_PREV, RTX_NEXT): Adjust for new JUMP_TABLE_DATA.
+ (INSN_DELETED_P): Likewise.
+ (emit_jump_table_data): New prototype.
+ * gengtype.c (adjust_field_rtx_def): Handle JUMP_TABLE_DATA fields
+ after 4th as unused.
+ * print-rtl.c (print_rtl): Handle JUMP_TABLE_DATA.
+ * sched-vis.c (print_insn): Likewise.
+ * emit-rtl.c (active_insn_p): Consider JUMP_TABLE_DATA an active
+ insn for compatibility with back ends that use next_active_insn to
+ identify jump table data.
+ (set_insn_deleted): Remove no longer useful JUMP_TABLE_DATA_P check.
+ (remove_insn): Likewise.
+ (emit_insn): Do not accept JUMP_TABLE_DATA objects in insn chains
+ to be emitted.
+ (emit_debug_insn, emit_jump_insn, emit_call_insn, emit_label): Idem.
+ (emit_jump_table_data): New function.
+
+ * cfgbuild.c (inside_basic_block_p): A JUMP_INSN is always inside a
+ basic block, a JUMP_TABLE_DATA never is.
+ (control_flow_insn_p): JUMP_TABLE_DATA is not a control flow insn.
+ * cfgrtl.c (duplicate_insn_chain): Split handling of JUMP_TABLE_DATA
+ off from code handling real insns.
+ * final.c (get_attr_length_1): Simplify for JUMP_INSNs.
+ * function.c (instantiate_virtual_regs): Remove JUMP_TABLE_DATA_P
+ test, now redundant because JUMP_TABLE_DATA is not an INSN_P insn.
+ * gcse.c (insert_insn_end_basic_block): Likewise, JUMP_TABLE_DATA_P
+ is not a NONDEBUG_INSN_P.
+ * ira-costs.c (scan_one_insn): Likewise.
+ * jump.c (mark_all_labels): Likewise.
+ (mark_jump_label_1): Likewise.
+ * lra-eliminations.c (eliminate_regs_in_insn): Likewise.
+ * lra.c (get_insn_freq): Expect all insns reaching here to be in
+ a basic block.
+ (check_rtl): Remove JUMP_TABLE_DATA_P test, not a NONDEBUG_INSN_P insn.
+ * predict.c (expensive_function_p): Use FOR_BB_INSNS.
+ * reload1.c (calculate_needs_all_insns): Call set_label_offsets for
+ JUMP_TABLE_DATA_P insns.
+ (calculate_elim_costs_all_insns): Likewise.
+ (set_label_offsets): Recurse on the PATTERN of JUMP_TABLE_DATA insns.
+ (elimination_costs_in_insn): Remove redundant JUMP_TABLE_DATA_P test.
+ (delete_output_reload): Code style fixups.
+ * reorg.c (dbr_schedule): Move JUMP_TABLE_DATA_P up to avoid setting
+ insn flags on this non-insn.
+ * sched-rgn.c (add_branch_dependences): Treat JUMP_TABLE_DATA insns
+ as scheduling barriers, for pre-change compatibility.
+ * stmt.c (emit_case_dispatch_table): Emit jump table data not as
+ JUMP_INSN objects but instead as JUMP_TABLE_DATA objects.
+
+ * config/alpha/alpha.c (alpha_does_function_need_gp): Remove
+ redundant JUMP_TABLE_DATA_P test.
+ * config/arm/arm.c (thumb_far_jump_used_p): Likewise.
+ * config/frv/frv.c (frv_function_contains_far_jump): Likewise.
+ (frv_for_each_packet): Likewise.
+ * config/i386/i386.c (min_insn_size): Likewise.
+ (ix86_avoid_jump_mispredicts): Likewise.
+ * config/m32r/m32r.c (m32r_is_insn): Likewise.
+ * config/mep/mep.c (mep_reorg_erepeat): Likewise.
+ * config/mips/mips.c (USEFUL_INSN_P): Likewise.
+ (mips16_insn_length): Robustify.
+ (mips_has_long_branch_p): Remove redundant JUMP_TABLE_DATA_P test.
+ (mips16_split_long_branches): Likewise.
+ * config/pa/pa.c (pa_combine_instructions): Likewise.
+ * config/rs6000/rs6000.c (get_next_active_insn): Treat
+ JUMP_TABLE_DATA objects as active insns, like in active_insn_p.
+ * config/s390/s390.c (s390_chunkify_start): Treat JUMP_TABLE_DATA
+ as contributing to pool range lengths.
+ * config/sh/sh.c (find_barrier): Restore check for ADDR_DIFF_VEC.
+ Remove redundant JUMP_TABLE_DATA_P test.
+ (sh_loop_align): Likewise.
+ (split_branches): Likewise.
+ (sh_insn_length_adjustment): Likewise.
+ * config/spu/spu.c (get_branch_target): Likewise.
+
2013-03-29 Jan Hubicka <jh@suse.cz>
* lto-cgraph.c (output_profile_summary, input_profile_summary): Use
if (JUMP_P (last_insn)
&& (succ->flags & EDGE_CROSSING))
{
- rtx label2, table;
+ rtx label2;
gcc_assert (!any_condjump_p (last_insn));
/* Make sure the jump is not already an indirect or table jump. */
if (!computed_jump_p (last_insn)
- && !tablejump_p (last_insn, &label2, &table))
+ && !tablejump_p (last_insn, &label2, NULL))
{
/* We have found a "crossing" unconditional branch. Now
we must convert it to an indirect jump. First create
|| ! JUMP_TABLE_DATA_P (insn));
case JUMP_INSN:
- return (! JUMP_TABLE_DATA_P (insn));
-
case CALL_INSN:
case INSN:
case DEBUG_INSN:
return true;
+ case JUMP_TABLE_DATA:
case BARRIER:
case NOTE:
return false;
return false;
case JUMP_INSN:
- /* Jump insn always causes control transfer except for tablejumps. */
- return (! JUMP_TABLE_DATA_P (insn));
+ return true;
case CALL_INSN:
/* Noreturn and sibling call instructions terminate the basic blocks
return false;
break;
+ case JUMP_TABLE_DATA:
case BARRIER:
- /* It is nonsense to reach barrier when looking for the
+ /* It is nonsense to reach this when looking for the
end of basic block, but before dead code is eliminated
this may happen. */
return false;
break;
case CODE_LABEL:
- /* An addr_vec is placed outside any basic block. */
+ /* An ADDR_VEC is placed outside any basic block. */
if (NEXT_INSN (x)
&& JUMP_TABLE_DATA_P (NEXT_INSN (x)))
x = NEXT_INSN (x);
{
gcc_assert (!onlyjump_p (bb_end_insn)
|| returnjump_p (bb_end_insn));
- BB_FOOTER (bb) = emit_barrier_after (bb_end_insn);
+ emit_barrier_after (bb_end_insn);
continue;
}
rtx
duplicate_insn_chain (rtx from, rtx to)
{
- rtx insn, last, copy;
+ rtx insn, next, last, copy;
/* Avoid updating of boundaries of previous basic block. The
note will get removed from insn stream in fixup. */
case INSN:
case CALL_INSN:
case JUMP_INSN:
- /* Avoid copying of dispatch tables. We never duplicate
- tablejumps, so this can hit only in case the table got
- moved far from original jump. */
- if (JUMP_TABLE_DATA_P (insn))
- {
- /* Avoid copying following barrier as well if any
- (and debug insns in between). */
- rtx next;
-
- for (next = NEXT_INSN (insn);
- next != NEXT_INSN (to);
- next = NEXT_INSN (next))
- if (!DEBUG_INSN_P (next))
- break;
- if (next != NEXT_INSN (to) && BARRIER_P (next))
- insn = next;
- break;
- }
copy = emit_copy_of_insn_after (insn, get_last_insn ());
if (JUMP_P (insn) && JUMP_LABEL (insn) != NULL_RTX
&& ANY_RETURN_P (JUMP_LABEL (insn)))
maybe_copy_prologue_epilogue_insn (insn, copy);
break;
+ case JUMP_TABLE_DATA:
+ /* Avoid copying of dispatch tables. We never duplicate
+ tablejumps, so this can hit only in case the table got
+ moved far from the original jump.
+ Avoid copying following barrier as well if any
+ (and debug insns in between). */
+ for (next = NEXT_INSN (insn);
+ next != NEXT_INSN (to);
+ next = NEXT_INSN (next))
+ if (!DEBUG_INSN_P (next))
+ break;
+ if (next != NEXT_INSN (to) && BARRIER_P (next))
+ insn = next;
+ break;
+
case CODE_LABEL:
break;
for (; insn; insn = NEXT_INSN (insn))
if (NONDEBUG_INSN_P (insn)
- && ! JUMP_TABLE_DATA_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& get_attr_usegp (insn))
insn with the far jump attribute set. */
for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
{
- if (JUMP_P (insn)
- /* Ignore tablejump patterns. */
- && ! JUMP_TABLE_DATA_P (insn)
- && get_attr_far_jump (insn) == FAR_JUMP_YES
- )
+ if (JUMP_P (insn) && get_attr_far_jump (insn) == FAR_JUMP_YES)
{
/* Record the fact that we have decided that
the function does use far jumps. */
rtx insn = get_insns ();
while (insn != NULL
&& !(JUMP_P (insn)
- /* Ignore tablejump patterns. */
- && ! JUMP_TABLE_DATA_P (insn)
&& get_attr_far_jump (insn) == FAR_JUMP_YES))
insn = NEXT_INSN (insn);
return (insn != NULL);
frv_start_packet_block ();
}
- if (INSN_P (insn) && ! JUMP_TABLE_DATA_P (insn))
+ if (INSN_P (insn))
switch (GET_CODE (PATTERN (insn)))
{
case USE:
if (GET_CODE (PATTERN (insn)) == UNSPEC_VOLATILE
&& XINT (PATTERN (insn), 1) == UNSPECV_ALIGN)
return 0;
- if (JUMP_TABLE_DATA_P (insn))
- return 0;
/* Important case - calls are always 5 bytes.
It is common to have many calls in the row. */
while (nbytes + max_skip >= 16)
{
start = NEXT_INSN (start);
- if ((JUMP_P (start)
- && ! JUMP_TABLE_DATA_P (start))
- || CALL_P (start))
+ if (JUMP_P (start) || CALL_P (start))
njumps--, isjump = 1;
else
isjump = 0;
if (dump_file)
fprintf (dump_file, "Insn %i estimated to %i bytes\n",
INSN_UID (insn), min_size);
- if ((JUMP_P (insn)
- && ! JUMP_TABLE_DATA_P (insn))
- || CALL_P (insn))
+ if (JUMP_P (insn) || CALL_P (insn))
njumps++;
else
continue;
while (njumps > 3)
{
start = NEXT_INSN (start);
- if ((JUMP_P (start)
- && ! JUMP_TABLE_DATA_P (start))
- || CALL_P (start))
+ if (JUMP_P (start) || CALL_P (start))
njumps--, isjump = 1;
else
isjump = 0;
m32r_is_insn (rtx insn)
{
return (NONDEBUG_INSN_P (insn)
- && ! JUMP_TABLE_DATA_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER);
}
for (insn = insns; insn; insn = NEXT_INSN (insn))
if (JUMP_P (insn)
- && ! JUMP_TABLE_DATA_P (insn)
&& mep_invertable_branch_p (insn))
{
if (dump_file)
moved to rtl.h. */
#define USEFUL_INSN_P(INSN) \
(NONDEBUG_INSN_P (INSN) \
- && ! JUMP_TABLE_DATA_P (INSN) \
&& GET_CODE (PATTERN (INSN)) != USE \
&& GET_CODE (PATTERN (INSN)) != CLOBBER)
rtx body = PATTERN (insn);
if (GET_CODE (body) == ADDR_VEC)
return GET_MODE_SIZE (GET_MODE (body)) * XVECLEN (body, 0);
- if (GET_CODE (body) == ADDR_DIFF_VEC)
+ else if (GET_CODE (body) == ADDR_DIFF_VEC)
return GET_MODE_SIZE (GET_MODE (body)) * XVECLEN (body, 1);
+ else
+ gcc_unreachable ();
}
return get_attr_length (insn);
}
for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
FOR_EACH_SUBINSN (subinsn, insn)
if (JUMP_P (subinsn)
- && USEFUL_INSN_P (subinsn)
&& get_attr_length (subinsn) > normal_length
&& (any_condjump_p (subinsn) || any_uncondjump_p (subinsn)))
return true;
something_changed = false;
for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
if (JUMP_P (insn)
- && USEFUL_INSN_P (insn)
&& get_attr_length (insn) > 8
&& (any_condjump_p (insn) || any_uncondjump_p (insn)))
{
/* We only care about INSNs, JUMP_INSNs, and CALL_INSNs.
Also ignore any special USE insns. */
if ((! NONJUMP_INSN_P (anchor) && ! JUMP_P (anchor) && ! CALL_P (anchor))
- || JUMP_TABLE_DATA_P (anchor)
|| GET_CODE (PATTERN (anchor)) == USE
|| GET_CODE (PATTERN (anchor)) == CLOBBER)
continue;
continue;
/* Anything except a regular INSN will stop our search. */
- if (! NONJUMP_INSN_P (floater)
- || JUMP_TABLE_DATA_P (floater))
+ if (! NONJUMP_INSN_P (floater))
{
floater = NULL_RTX;
break;
continue;
/* Anything except a regular INSN will stop our search. */
- if (! NONJUMP_INSN_P (floater)
- || JUMP_TABLE_DATA_P (floater))
+ if (! NONJUMP_INSN_P (floater))
{
floater = NULL_RTX;
break;
#undef TARGET_FUNCTION_OK_FOR_SIBCALL
#define TARGET_FUNCTION_OK_FOR_SIBCALL rs6000_function_ok_for_sibcall
-#undef TARGET_INVALID_WITHIN_DOLOOP
-#define TARGET_INVALID_WITHIN_DOLOOP rs6000_invalid_within_doloop
-
#undef TARGET_REGISTER_MOVE_COST
#define TARGET_REGISTER_MOVE_COST rs6000_register_move_cost
#undef TARGET_MEMORY_MOVE_COST
return false;
}
-/* NULL if INSN insn is valid within a low-overhead loop.
- Otherwise return why doloop cannot be applied.
- PowerPC uses the COUNT register for branch on table instructions. */
-
-static const char *
-rs6000_invalid_within_doloop (const_rtx insn)
-{
- if (CALL_P (insn))
- return "Function call in the loop.";
-
- if (JUMP_TABLE_DATA_P (insn))
- return "Computed branch in the loop.";
-
- return NULL;
-}
-
static int
rs6000_ra_ever_killed (void)
{
return NULL_RTX;
if (CALL_P (insn)
- || JUMP_P (insn)
+ || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (NONJUMP_INSN_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
}
}
- if (JUMP_P (insn) || LABEL_P (insn))
+ if (JUMP_P (insn) || JUMP_TABLE_DATA_P (insn) || LABEL_P (insn))
{
if (curr_pool)
s390_add_pool_insn (curr_pool, insn);
if (found_si > count_si)
count_si = found_si;
}
- else if (JUMP_TABLE_DATA_P (from))
+ else if (JUMP_TABLE_DATA_P (from)
+ && GET_CODE (PATTERN (from)) == ADDR_DIFF_VEC)
{
if ((num_mova > 1 && GET_MODE (prev_nonnote_insn (from)) == VOIDmode)
|| (num_mova
/* There is a possibility that a bf is transformed into a bf/s by the
delay slot scheduler. */
- if (JUMP_P (from) && !JUMP_TABLE_DATA_P (from)
+ if (JUMP_P (from)
&& get_attr_type (from) == TYPE_CBRANCH
&& ! sequence_insn_p (from))
inc += 2;
if (! next
|| ! INSN_P (next)
- || GET_CODE (PATTERN (next)) == ADDR_DIFF_VEC
|| recog_memoized (next) == CODE_FOR_consttable_2)
return 0;
so transform it into a note. */
SET_INSN_DELETED (insn);
}
- else if (JUMP_P (insn)
- /* Don't mess with ADDR_DIFF_VEC */
- && ! JUMP_TABLE_DATA_P (insn))
+ else if (JUMP_P (insn))
{
enum attr_type type = get_attr_type (insn);
if (type == TYPE_CBRANCH)
if (((NONJUMP_INSN_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER)
- || CALL_P (insn)
- || (JUMP_P (insn) && !JUMP_TABLE_DATA_P (insn)))
+ || CALL_P (insn) || JUMP_P (insn))
&& ! sequence_insn_p (insn)
&& get_attr_needs_delay_slot (insn) == NEEDS_DELAY_SLOT_YES)
return 2;
/* SH2e has a bug that prevents the use of annulled branches, so if
the delay slot is not filled, we'll have to put a NOP in it. */
if (sh_cpu_attr == CPU_SH2E
- && JUMP_P (insn) && !JUMP_TABLE_DATA_P (insn)
+ && JUMP_P (insn)
&& get_attr_type (insn) == TYPE_CBRANCH
&& ! sequence_insn_p (insn))
return 2;
if (GET_CODE (PATTERN (branch)) == RETURN)
return gen_rtx_REG (SImode, LINK_REGISTER_REGNUM);
- /* jump table */
- if (JUMP_TABLE_DATA_P (branch))
- return 0;
-
/* ASM GOTOs. */
if (extract_asm_operands (PATTERN (branch)) != NULL)
return NULL;
active_insn_p (const_rtx insn)
{
return (CALL_P (insn) || JUMP_P (insn)
+ || JUMP_TABLE_DATA_P (insn) /* FIXME */
|| (NONJUMP_INSN_P (insn)
&& (! reload_completed
|| (GET_CODE (PATTERN (insn)) != USE
void
set_insn_deleted (rtx insn)
{
- if (INSN_P (insn) && !JUMP_TABLE_DATA_P (insn))
+ if (INSN_P (insn))
df_insn_delete (insn);
PUT_CODE (insn, NOTE);
NOTE_KIND (insn) = NOTE_INSN_DELETED;
}
/* Invalidate data flow information associated with INSN. */
- if (INSN_P (insn) && !JUMP_TABLE_DATA_P (insn))
+ if (INSN_P (insn))
df_insn_delete (insn);
/* Fix up basic block boundaries, if necessary. */
break;
#ifdef ENABLE_RTL_CHECKING
+ case JUMP_TABLE_DATA:
case SEQUENCE:
gcc_unreachable ();
break;
break;
#ifdef ENABLE_RTL_CHECKING
+ case JUMP_TABLE_DATA:
case SEQUENCE:
gcc_unreachable ();
break;
break;
#ifdef ENABLE_RTL_CHECKING
+ case JUMP_TABLE_DATA:
case SEQUENCE:
gcc_unreachable ();
break;
#ifdef ENABLE_RTL_CHECKING
case SEQUENCE:
+ case JUMP_TABLE_DATA:
gcc_unreachable ();
break;
#endif
return label;
}
+/* Make an insn of code JUMP_TABLE_DATA
+ and add it to the end of the doubly-linked list. */
+
+rtx
+emit_jump_table_data (rtx table)
+{
+ rtx jump_table_data = rtx_alloc (JUMP_TABLE_DATA);
+ INSN_UID (jump_table_data) = cur_insn_uid++;
+ PATTERN (jump_table_data) = table;
+ BLOCK_FOR_INSN (jump_table_data) = NULL;
+ add_insn (jump_table_data);
+ return jump_table_data;
+}
+
/* Make an insn of code BARRIER
and add it to the end of the doubly-linked list. */
return 0;
case CALL_INSN:
- length = fallback_fn (insn);
- break;
-
case JUMP_INSN:
- body = PATTERN (insn);
- if (JUMP_TABLE_DATA_P (insn))
- {
- /* Alignment is machine-dependent and should be handled by
- ADDR_VEC_ALIGN. */
- }
- else
- length = fallback_fn (insn);
+ length = fallback_fn (insn);
break;
case INSN:
{
/* These patterns in the instruction stream can never be recognized.
Fortunately, they shouldn't contain virtual registers either. */
- if (JUMP_TABLE_DATA_P (insn)
- || GET_CODE (PATTERN (insn)) == USE
+ if (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == ASM_INPUT)
continue;
&& (!single_succ_p (bb)
|| single_succ_edge (bb)->flags & EDGE_ABNORMAL)))
{
-#ifdef HAVE_cc0
- rtx note;
-#endif
-
- /* If this is a jump table, then we can't insert stuff here. Since
- we know the previous real insn must be the tablejump, we insert
- the new instruction just before the tablejump. */
- if (JUMP_TABLE_DATA_P (insn))
- insn = prev_active_insn (insn);
-
#ifdef HAVE_cc0
/* FIXME: 'twould be nice to call prev_cc0_setter here but it aborts
if cc0 isn't set. */
- note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX);
+ rtx note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX);
if (note)
insn = XEXP (note, 0);
else
t = scalar_tp, subname = "rt_int";
else if (i == SYMBOL_REF && aindex == 2)
t = symbol_union_tp, subname = "";
+ else if (i == JUMP_TABLE_DATA && aindex >= 5)
+ t = scalar_tp, subname = "rt_int";
else if (i == BARRIER && aindex >= 3)
t = scalar_tp, subname = "rt_int";
else if (i == ENTRY_VALUE && aindex == 0)
int i, k;
bool counted_mem;
- if (!NONDEBUG_INSN_P (insn)
- || JUMP_TABLE_DATA_P (insn))
+ if (!NONDEBUG_INSN_P (insn))
return insn;
pat_code = GET_CODE (PATTERN (insn));
basic blocks. If those non-insns represent tablejump data,
they contain label references that we must record. */
for (insn = BB_HEADER (bb); insn; insn = NEXT_INSN (insn))
- if (INSN_P (insn))
- {
- gcc_assert (JUMP_TABLE_DATA_P (insn));
- mark_jump_label (PATTERN (insn), insn, 0);
- }
+ if (JUMP_TABLE_DATA_P (insn))
+ mark_jump_label (PATTERN (insn), insn, 0);
for (insn = BB_FOOTER (bb); insn; insn = NEXT_INSN (insn))
- if (INSN_P (insn))
- {
- gcc_assert (JUMP_TABLE_DATA_P (insn));
- mark_jump_label (PATTERN (insn), insn, 0);
- }
+ if (JUMP_TABLE_DATA_P (insn))
+ mark_jump_label (PATTERN (insn), insn, 0);
}
}
else
;
else if (LABEL_P (insn))
prev_nonjump_insn = NULL;
+ else if (JUMP_TABLE_DATA_P (insn))
+ mark_jump_label (PATTERN (insn), insn, 0);
else if (NONDEBUG_INSN_P (insn))
{
mark_jump_label (PATTERN (insn), insn, 0);
return;
}
- /* Do walk the labels in a vector, but not the first operand of an
- ADDR_DIFF_VEC. Don't set the JUMP_LABEL of a vector. */
+ /* Do walk the labels in a vector, but not the first operand of an
+ ADDR_DIFF_VEC. Don't set the JUMP_LABEL of a vector. */
case ADDR_VEC:
case ADDR_DIFF_VEC:
if (! INSN_DELETED_P (insn))
if (icode < 0 && asm_noperands (PATTERN (insn)) < 0 && ! DEBUG_INSN_P (insn))
{
- lra_assert (JUMP_TABLE_DATA_P (insn)
- || GET_CODE (PATTERN (insn)) == USE
+ lra_assert (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == ASM_INPUT);
return;
static int
get_insn_freq (rtx insn)
{
- basic_block bb;
+ basic_block bb = BLOCK_FOR_INSN (insn);
- if ((bb = BLOCK_FOR_INSN (insn)) != NULL)
- return REG_FREQ_FROM_BB (bb);
- else
- {
- lra_assert (lra_insn_recog_data[INSN_UID (insn)]
- ->insn_static_data->n_operands == 0);
- /* We don't care about such insn, e.g. it might be jump with
- addr_vec. */
- return 1;
- }
+ gcc_checking_assert (bb != NULL);
+ return REG_FREQ_FROM_BB (bb);
}
/* Invalidate all reg info of INSN with DATA and execution frequency
FOR_EACH_BB (bb)
FOR_BB_INSNS (bb, insn)
if (NONDEBUG_INSN_P (insn)
- && ! JUMP_TABLE_DATA_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& GET_CODE (PATTERN (insn)) != ASM_INPUT)
if ((pred->flags & EDGE_ABNORMAL_CALL) && cfun->has_nonlocal_label)
return false;
- if (JUMP_TABLE_DATA_P (BB_END (pred->src)))
+ if (tablejump_p (BB_END (pred->src), NULL, NULL))
return false;
}
return true;
{
rtx insn;
- for (insn = BB_HEAD (bb); insn != NEXT_INSN (BB_END (bb));
- insn = NEXT_INSN (insn))
+ FOR_BB_INSNS (bb, insn)
if (active_insn_p (insn))
{
sum += bb->frequency;
case CALL_INSN:
case NOTE:
case CODE_LABEL:
+ case JUMP_TABLE_DATA:
case BARRIER:
for (tmp_rtx = rtx_first; tmp_rtx != 0; tmp_rtx = NEXT_INSN (tmp_rtx))
{
include REG_LABEL_OPERAND and REG_LABEL_TARGET), we need to see
what effects this has on the known offsets at labels. */
- if (LABEL_P (insn) || JUMP_P (insn)
+ if (LABEL_P (insn) || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (INSN_P (insn) && REG_NOTES (insn) != 0))
set_label_offsets (insn, insn, 0);
include REG_LABEL_OPERAND and REG_LABEL_TARGET), we need to see
what effects this has on the known offsets at labels. */
- if (LABEL_P (insn) || JUMP_P (insn)
+ if (LABEL_P (insn) || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (INSN_P (insn) && REG_NOTES (insn) != 0))
set_label_offsets (insn, insn, 0);
return;
+ case JUMP_TABLE_DATA:
+ set_label_offsets (PATTERN (insn), insn, initial_p);
+ return;
+
case JUMP_INSN:
set_label_offsets (PATTERN (insn), insn, initial_p);
if (! insn_is_asm && icode < 0)
{
- gcc_assert (JUMP_TABLE_DATA_P (insn)
+ gcc_assert (DEBUG_INSN_P (insn)
|| GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
- || GET_CODE (PATTERN (insn)) == ASM_INPUT
- || DEBUG_INSN_P (insn));
+ || GET_CODE (PATTERN (insn)) == ASM_INPUT);
if (DEBUG_INSN_P (insn))
INSN_VAR_LOCATION_LOC (insn)
= eliminate_regs (INSN_VAR_LOCATION_LOC (insn), VOIDmode, insn);
if (! insn_is_asm && icode < 0)
{
- gcc_assert (JUMP_TABLE_DATA_P (insn)
+ gcc_assert (DEBUG_INSN_P (insn)
|| GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
- || GET_CODE (PATTERN (insn)) == ASM_INPUT
- || DEBUG_INSN_P (insn));
+ || GET_CODE (PATTERN (insn)) == ASM_INPUT);
return;
}
since if they are the only uses, they are dead. */
if (set != 0 && SET_DEST (set) == reg)
continue;
- if (LABEL_P (i2)
- || JUMP_P (i2))
+ if (LABEL_P (i2) || JUMP_P (i2))
break;
if ((NONJUMP_INSN_P (i2) || CALL_P (i2))
&& reg_mentioned_p (reg, PATTERN (i2)))
delete_address_reloads (i2, insn);
delete_insn (i2);
}
- if (LABEL_P (i2)
- || JUMP_P (i2))
+ if (LABEL_P (i2) || JUMP_P (i2))
break;
}
{
rtx target;
- if (JUMP_P (insn))
- INSN_ANNULLED_BRANCH_P (insn) = 0;
- INSN_FROM_TARGET_P (insn) = 0;
-
/* Skip vector tables. We can't get attributes for them. */
if (JUMP_TABLE_DATA_P (insn))
continue;
+ if (JUMP_P (insn))
+ INSN_ANNULLED_BRANCH_P (insn) = 0;
+ INSN_FROM_TARGET_P (insn) = 0;
+
if (num_delay_slots (insn) > 0)
obstack_ptr_grow (&unfilled_slots_obstack, insn);
RTX_BITFIELD_OPS
an rtx code for a bit-field operation (ZERO_EXTRACT, SIGN_EXTRACT)
RTX_INSN
- an rtx code for a machine insn (INSN, JUMP_INSN, CALL_INSN)
+ an rtx code for a machine insn (INSN, JUMP_INSN, CALL_INSN) or
+ data that will be output as assembly pseudo-ops (JUMP_TABLE_DATA)
RTX_MATCH
an rtx code for something that matches in insns (e.g, MATCH_DUP)
RTX_AUTOINC
All other fields ( rtx->u.fld[] ) have exact same meaning as INSN's. */
DEF_RTL_EXPR(CALL_INSN, "call_insn", "iuuBeiiee", RTX_INSN)
+/* Placeholder for tablejump JUMP_INSNs. The pattern of this kind
+ of rtx is always either an ADDR_VEC or an ADDR_DIFF_VEC. These
+ placeholders do not appear as real instructions inside a basic
+ block, but are considered active_insn_p instructions for historical
+ reasons, dating from when jump table data was represented with
+ JUMP_INSNs. */
+DEF_RTL_EXPR(JUMP_TABLE_DATA, "jump_table_data", "iuuBe0000", RTX_INSN)
+
/* A marker that indicates that control will not flow through. */
DEF_RTL_EXPR(BARRIER, "barrier", "iuu00000", RTX_EXTRA)
/* Similar, but a volatile operation and one which may trap. */
DEF_RTL_EXPR(UNSPEC_VOLATILE, "unspec_volatile", "Ei", RTX_EXTRA)
-/* Vector of addresses, stored as full words. */
-/* Each element is a LABEL_REF to a CODE_LABEL whose address we want. */
+/* ----------------------------------------------------------------------
+ Table jump addresses.
+ ---------------------------------------------------------------------- */
+
+/* Vector of addresses, stored as full words.
+ Each element is a LABEL_REF to a CODE_LABEL whose address we want. */
DEF_RTL_EXPR(ADDR_VEC, "addr_vec", "E", RTX_EXTRA)
/* Vector of address differences X0 - BASE, X1 - BASE, ...
The third, fourth and fifth operands are only valid when
CASE_VECTOR_SHORTEN_MODE is defined, and only in an optimizing
compilation. */
-
DEF_RTL_EXPR(ADDR_DIFF_VEC, "addr_diff_vec", "eEee0", RTX_EXTRA)
/* Memory prefetch, with attributes supported on some targets.
*/
#define RTX_PREV(X) ((INSN_P (X) \
|| NOTE_P (X) \
+ || JUMP_TABLE_DATA_P (X) \
|| BARRIER_P (X) \
|| LABEL_P (X)) \
&& PREV_INSN (X) != NULL \
#define BARRIER_P(X) (GET_CODE (X) == BARRIER)
/* Predicate yielding nonzero iff X is a data for a jump table. */
-#define JUMP_TABLE_DATA_P(INSN) \
- (JUMP_P (INSN) && (GET_CODE (PATTERN (INSN)) == ADDR_VEC || \
- GET_CODE (PATTERN (INSN)) == ADDR_DIFF_VEC))
+#define JUMP_TABLE_DATA_P(INSN) (GET_CODE (INSN) == JUMP_TABLE_DATA)
/* Predicate yielding nonzero iff X is a return or simple_return. */
#define ANY_RETURN_P(X) \
/* 1 if RTX is an insn that has been deleted. */
#define INSN_DELETED_P(RTX) \
- (RTL_FLAG_CHECK7("INSN_DELETED_P", (RTX), DEBUG_INSN, INSN, \
- CALL_INSN, JUMP_INSN, \
+ (RTL_FLAG_CHECK8("INSN_DELETED_P", (RTX), DEBUG_INSN, INSN, \
+ CALL_INSN, JUMP_INSN, JUMP_TABLE_DATA, \
CODE_LABEL, BARRIER, NOTE)->volatil)
/* 1 if RTX is a call to a const function. Built from ECF_CONST and
/* 1 if RTX is a call to a looping const or pure function. Built from
ECF_LOOPING_CONST_OR_PURE and DECL_LOOPING_CONST_OR_PURE_P. */
-#define RTL_LOOPING_CONST_OR_PURE_CALL_P(RTX) \
+#define RTL_LOOPING_CONST_OR_PURE_CALL_P(RTX) \
(RTL_FLAG_CHECK1("CONST_OR_PURE_CALL_P", (RTX), CALL_INSN)->call)
/* 1 if RTX is a call_insn for a sibling call. */
extern rtx emit_jump_insn (rtx);
extern rtx emit_call_insn (rtx);
extern rtx emit_label (rtx);
+extern rtx emit_jump_table_data (rtx);
extern rtx emit_barrier (void);
extern rtx emit_note (enum insn_note);
extern rtx emit_note_copy (rtx);
insn = tail;
last = 0;
while (CALL_P (insn)
- || JUMP_P (insn)
+ || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (NONJUMP_INSN_P (insn)
&& (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
possible improvement for handling COND_EXECs in this scheduler: it
could remove always-true predicates. */
- if (!reload_completed || ! JUMP_P (tail))
+ if (!reload_completed || ! (JUMP_P (tail) || JUMP_TABLE_DATA_P (tail)))
return;
insn = tail;
case CODE_LABEL:
pp_printf (pp, "L%d:", INSN_UID (x));
break;
+ case JUMP_TABLE_DATA:
+ pp_string (pp, "jump_table_data{\n");
+ print_pattern (pp, PATTERN (x), verbose);
+ pp_string (pp, "}");
+ break;
case BARRIER:
pp_string (pp, "barrier");
break;
emit_label (table_label);
if (CASE_VECTOR_PC_RELATIVE || flag_pic)
- emit_jump_insn (gen_rtx_ADDR_DIFF_VEC (CASE_VECTOR_MODE,
- gen_rtx_LABEL_REF (Pmode, table_label),
- gen_rtvec_v (ncases, labelvec),
- const0_rtx, const0_rtx));
+ emit_jump_table_data (gen_rtx_ADDR_DIFF_VEC (CASE_VECTOR_MODE,
+ gen_rtx_LABEL_REF (Pmode,
+ table_label),
+ gen_rtvec_v (ncases, labelvec),
+ const0_rtx, const0_rtx));
else
- emit_jump_insn (gen_rtx_ADDR_VEC (CASE_VECTOR_MODE,
- gen_rtvec_v (ncases, labelvec)));
+ emit_jump_table_data (gen_rtx_ADDR_VEC (CASE_VECTOR_MODE,
+ gen_rtvec_v (ncases, labelvec)));
/* Record no drop-through after the table. */
emit_barrier ();
if (CALL_P (insn))
return "Function call in loop.";
- if (JUMP_TABLE_DATA_P (insn))
+ if (tablejump_p (insn, NULL, NULL) || computed_jump_p (insn))
return "Computed branch in the loop.";
return NULL;