author     Richard Sandiford <richard.sandiford@arm.com>  2019-09-30 16:20:52 +0000
committer  Richard Sandiford <rsandifo@gcc.gnu.org>       2019-09-30 16:20:52 +0000
commit     6c47622219d6386807b26890dcdc84f192499d33
tree       9ea524a6474c5ed4540c9d867042bd4df6f45c85
parent     7450506b5d48642a71459cfc24efcea6ca58e97e
Remove global call sets: IRA
For -fipa-ra, IRA already keeps track of which specific registers are
call-clobbered in a region, rather than using global information.  The
patch generalises this so that it tracks which ABIs are used by calls
in the region.  We can then use the new ABI descriptors to handle
partially-clobbered registers in the same way as fully-clobbered
registers, without having special code for
targetm.hard_regno_call_part_clobbered.  This in turn makes -fipa-ra
work for partially-clobbered registers too.

A side-effect of allowing multiple ABIs is that we no longer have an
obvious set of conflicting registers for the self-described "fragile
hack" in ira-constraints.c.  This code kicks in for user-defined
registers that aren't live across a call at -O0, and it tries to avoid
allocating a call-clobbered register to them.  Here I've used the set
of call-clobbered registers in the current function's ABI, applying on
top of any registers that are clobbered by called functions.  This is
enough to keep gcc.dg/debug/dwarf2/pr5948.c happy.

The handling of GENERIC_STACK_CHECK in do_reload seemed to have a
reversed condition:

      for (int i = 0; i < FIRST_PSEUDO_REGISTER; i++)
        if (df_regs_ever_live_p (i)
            && !fixed_regs[i]
            && call_used_or_fixed_reg_p (i))
          size += UNITS_PER_WORD;

The final part of the condition counts registers that don't need to be
saved in the prologue, but I think the opposite was intended.

2019-09-30  Richard Sandiford  <richard.sandiford@arm.com>

gcc/
	* function-abi.h (call_clobbers_in_region): Declare.
	(call_clobbered_in_region_p): New function.
	* function-abi.cc (call_clobbers_in_region): Likewise.
	* ira-int.h: Include function-abi.h.
	(ira_allocno::crossed_calls_abis): New field.
	(ALLOCNO_CROSSED_CALLS_ABIS): New macro.
	(ira_need_caller_save_regs): New function.
	(ira_need_caller_save_p): Likewise.
	* ira.c (setup_reg_renumber): Use ira_need_caller_save_p instead
	of call_used_or_fixed_regs.
	(do_reload): Use crtl->abi to test whether the current function
	needs to save a register in the prologue.  Count registers that
	need to be saved rather than registers that don't.
	* ira-build.c (create_cap_allocno): Copy ALLOCNO_CROSSED_CALLS_ABIS.
	Remove unnecessary | from ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS.
	(propagate_allocno_info): Merge ALLOCNO_CROSSED_CALLS_ABIS too.
	(propagate_some_info_from_allocno): Likewise.
	(copy_info_to_removed_store_destinations): Likewise.
	(ira_flattening): Say that ALLOCNO_CROSSED_CALLS_ABIS and
	ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS are handled conservatively.
	(ira_build): Use ira_need_caller_save_regs instead of
	call_used_or_fixed_regs.
	* ira-color.c (calculate_saved_nregs): Use crtl->abi to test
	whether the current function would need to save a register
	before using it.
	(calculate_spill_cost): Likewise.
	(allocno_reload_assign): Use ira_need_caller_save_regs and
	ira_need_caller_save_p instead of call_used_or_fixed_regs.
	* ira-conflicts.c (ira_build_conflicts): Use
	ira_need_caller_save_regs rather than call_used_or_fixed_regs
	as the set of call-clobbered registers.  Remove the
	call_used_or_fixed_regs mask from the calculation of
	temp_hard_reg_set and mask its use instead.  Remove special
	handling of partially-clobbered registers.
	* ira-costs.c (ira_tune_allocno_costs): Use ira_need_caller_save_p.
	* ira-lives.c (process_bb_node_lives): Use mode_clobbers to
	calculate the set of conflicting registers for calls that can
	throw.  Record the ABIs of calls in ALLOCNO_CROSSED_CALLS_ABIS.
	Use full_and_partial_reg_clobbers rather than full_reg_clobbers
	for the calculation of ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS.
	Use eh_edge_abi to calculate the set of registers that could be
	clobbered by an EH edge.  Include partially-clobbered as well
	as fully-clobbered registers.

From-SVN: r276325
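The key data-structure change is easy to see in isolation: instead of
one global clobber set, each allocno records a bitmask of the ABI ids
of the calls it crosses, and the caller-save set can then be recomputed
on demand as the union of those ABIs' clobbers.  Below is a minimal,
self-contained sketch of that idiom; abi_info, region_info,
record_crossed_call and clobbers_in_region are hypothetical stand-ins
for GCC's function_abi, the new allocno fields and
call_clobbers_in_region, with register sets simplified to 64-bit masks:

  #include <cstdint>
  #include <vector>

  // Stand-in for GCC's function_abi: a dense id plus the set of
  // registers that calls with this ABI clobber (fully or partially).
  struct abi_info
  {
    unsigned int id;                          // dense ABI id, < 32
    uint64_t full_and_partial_reg_clobbers;   // simplified clobber set
  };

  // Stand-in for the new allocno fields: the region records *which*
  // ABIs its calls use, not the individual calls.
  struct region_info
  {
    unsigned int crossed_abis = 0;  // bit N set => some call uses ABI N
    uint64_t crossed_clobbers = 0;  // running union, as kept for -fipa-ra
  };

  // Record one crossed call, mirroring the two updates the patch
  // makes in process_bb_node_lives.
  void
  record_crossed_call (region_info &r, const abi_info &callee_abi)
  {
    r.crossed_abis |= 1u << callee_abi.id;
    r.crossed_clobbers |= callee_abi.full_and_partial_reg_clobbers;
  }

  // Union of the clobber sets of every ABI named in ABIS: the role
  // played by call_clobbers_in_region in the patch.
  uint64_t
  clobbers_in_region (unsigned int abis,
                      const std::vector<abi_info> &all_abis)
  {
    uint64_t clobbers = 0;
    for (const abi_info &abi : all_abis)
      if (abis & (1u << abi.id))
        clobbers |= abi.full_and_partial_reg_clobbers;
    return clobbers;
  }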
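The corrected do_reload loop lives in ira.c, which the file-limited
diff below doesn't show.  Going by the ChangeLog entry ("Count
registers that need to be saved rather than registers that don't"),
the fix should negate the final test along these lines; treat this as
a sketch, with clobbers_full_reg_p assumed from the new function_abi
interface rather than quoted from the patch:

  for (int i = 0; i < FIRST_PSEUDO_REGISTER; i++)
    if (df_regs_ever_live_p (i)
        && !fixed_regs[i]
        /* A live, allocatable register that calls do NOT clobber must
           be saved by the prologue, so it contributes to the size.  */
        && !crtl->abi->clobbers_full_reg_p (i))
      size += UNITS_PER_WORD;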
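The switch to mode_clobbers in the can_throw_internal hunk below is
where partial clobbers earn their keep: a partially call-clobbered
register only conflicts with an allocno whose mode extends into the
clobbered part.  Here is a toy model of that query, assuming each
register is described by how many low bytes a callee preserves (GCC
instead derives this from the ABI descriptors built around
targetm.hard_regno_call_part_clobbered):

  #include <cstdint>

  // Toy per-ABI view: for each of 64 registers, the number of low
  // bytes a callee preserves (0 => fully clobbered; large => saved).
  struct abi_model
  {
    unsigned int preserved_bytes[64];

    // Analogue of function_abi::mode_clobbers: a register is unsafe
    // for a value occupying MODE_SIZE bytes as soon as the value
    // extends past the preserved part.  Fully-clobbered registers
    // (0 preserved bytes) are included for every mode, so this
    // subsumes the full clobber set.
    uint64_t mode_clobbers (unsigned int mode_size) const
    {
      uint64_t set = 0;
      for (unsigned int r = 0; r < 64; ++r)
        if (preserved_bytes[r] < mode_size)
          set |= uint64_t (1) << r;
      return set;
    }
  };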
Diffstat (limited to 'gcc/ira-lives.c')
 gcc/ira-lives.c | 24 +++++++++++-------------
 1 file changed, 11 insertions(+), 13 deletions(-)
diff --git a/gcc/ira-lives.c b/gcc/ira-lives.c
index e24831a207c..cce73a1c3d4 100644
--- a/gcc/ira-lives.c
+++ b/gcc/ira-lives.c
@@ -1255,11 +1255,7 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
ira_object_t obj = ira_object_id_map[i];
a = OBJECT_ALLOCNO (obj);
int num = ALLOCNO_NUM (a);
- HARD_REG_SET this_call_used_reg_set
- = insn_callee_abi (insn).full_reg_clobbers ();
- /* ??? This preserves traditional behavior; it might not be
- needed. */
- this_call_used_reg_set |= fixed_reg_set;
+ function_abi callee_abi = insn_callee_abi (insn);
/* Don't allocate allocnos that cross setjmps or any
call, if this function receives a nonlocal
@@ -1275,9 +1271,9 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
if (can_throw_internal (insn))
{
OBJECT_CONFLICT_HARD_REGS (obj)
- |= this_call_used_reg_set;
+ |= callee_abi.mode_clobbers (ALLOCNO_MODE (a));
OBJECT_TOTAL_CONFLICT_HARD_REGS (obj)
- |= this_call_used_reg_set;
+ |= callee_abi.mode_clobbers (ALLOCNO_MODE (a));
}
if (sparseset_bit_p (allocnos_processed, num))
@@ -1294,8 +1290,9 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
/* Mark it as saved at the next call. */
allocno_saved_at_call[num] = last_call_num + 1;
ALLOCNO_CALLS_CROSSED_NUM (a)++;
+ ALLOCNO_CROSSED_CALLS_ABIS (a) |= 1 << callee_abi.id ();
ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a)
- |= this_call_used_reg_set;
+ |= callee_abi.full_and_partial_reg_clobbers ();
if (cheap_reg != NULL_RTX
&& ALLOCNO_REGNO (a) == (int) REGNO (cheap_reg))
ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a)++;
@@ -1359,10 +1356,11 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
}
/* Allocnos can't go in stack regs at the start of a basic block
- that is reached by an abnormal edge. Likewise for call
- clobbered regs, because caller-save, fixup_abnormal_edges and
- possibly the table driven EH machinery are not quite ready to
- handle such allocnos live across such edges. */
+ that is reached by an abnormal edge. Likewise for registers
+ that are at least partly call clobbered, because caller-save,
+ fixup_abnormal_edges and possibly the table driven EH machinery
+ are not quite ready to handle such allocnos live across such
+ edges. */
if (bb_has_abnormal_pred (bb))
{
#ifdef STACK_REGS
@@ -1382,7 +1380,7 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
if (!cfun->has_nonlocal_label
&& has_abnormal_call_or_eh_pred_edge_p (bb))
for (px = 0; px < FIRST_PSEUDO_REGISTER; px++)
- if (call_used_or_fixed_reg_p (px)
+ if (eh_edge_abi.clobbers_at_least_part_of_reg_p (px)
#ifdef REAL_PIC_OFFSET_TABLE_REGNUM
/* We should create a conflict of PIC pseudo with
PIC hard reg as PIC hard reg can have a wrong