/* Driver of optimization process
   Copyright (C) 2003-2014 Free Software Foundation, Inc.
   Contributed by Jan Hubicka

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 3, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<http://www.gnu.org/licenses/>.  */

/* This module implements the main driver of the compilation process.

   The main scope of this file is to act as an interface in between
   tree based frontends and the backend.

   The front-end is supposed to use the following functionality:

    - finalize_function

      This function is called once the front-end has parsed the whole body of
      a function and it is certain that neither the function body nor the
      declaration will change.

      (There is one exception needed for implementing GCC extern inline
      functions.)

    - varpool_finalize_decl

      This function has the same behavior as the above but is used for static
      variables.

    - add_asm_node

      Insert a new toplevel ASM statement.

    - finalize_compilation_unit

      This function is called once the (source level) compilation unit is
      finalized and it will no longer change.

      The symbol table is constructed starting from the trivially needed
      symbols finalized by the frontend.  Functions are lowered into
      GIMPLE representation and callgraph/reference lists are constructed.
      Those are used to discover other necessary functions and variables.

      At the end the bodies of unreachable functions are removed.

      The function can be called multiple times when multiple source level
      compilation units are combined.

    - compile

      This passes control to the back-end.  Optimizations are performed and
      the final assembler is generated.  This is done in the following way.
      Note that with link time optimization the process is split into three
      stages (compile time, linktime analysis and parallel linktime as
      indicated below).

      Compile time:

	1) Inter-procedural optimization.
	   (ipa_passes)

	   This part is further split into:

	   a) early optimizations.  These are local passes executed in
	      the topological order on the callgraph.

	      The purpose of early optimizations is to optimize away simple
	      things that may otherwise confuse IP analysis.  Very simple
	      propagation across the callgraph is done, i.e. to discover
	      functions without side effects, and simple inlining is performed.

	   b) early small interprocedural passes.

	      Those are interprocedural passes executed only at compilation
	      time.  These include, for example, transactional memory
	      lowering, unreachable code removal and other simple
	      transformations.

	   c) IP analysis stage.  All interprocedural passes do their
	      analysis.

	      Interprocedural passes differ from small interprocedural
	      passes by their ability to operate across the whole program
	      at linktime.  Their analysis stage is performed early to
	      both reduce linking times and linktime memory usage by
	      not having to represent the whole program in memory.

	   d) LTO streaming.  When doing LTO, everything important gets
	      streamed into the object file.

      Compile time and/or linktime analysis stage (WPA):

	   At linktime units get streamed back and the symbol table is
	   merged.  Function bodies are not streamed in and not
	   available.

	   e) IP propagation stage.  All IP passes execute their
	      IP propagation.  This is done based on the earlier analysis
	      without having function bodies at hand.

	   f) Ltrans streaming.  When doing WHOPR LTO, the program
	      is partitioned and streamed into multiple object files.

      Compile time and/or parallel linktime stage (ltrans)

	   Each of the object files is streamed back and compiled
	   separately.  Now the function bodies become available
	   again.

	2) Virtual clone materialization
	   (cgraph_materialize_clone)

	   IP passes can produce copies of existing functions (such
	   as versioned clones or inline clones) without actually
	   manipulating their bodies by creating virtual clones in
	   the callgraph.  At this time the virtual clones are
	   turned into real functions.

	3) IP transformation

	   All IP passes transform function bodies based on the earlier
	   decisions of the IP propagation.

	4) late small IP passes

	   Simple IP passes working within a single program partition.

	5) Expansion
	   (expand_all_functions)

	   At this stage functions that need to be output into
	   assembler are identified and compiled in topological order.

	6) Output of variables and aliases
	   Now it is known which variable references were not optimized
	   out and thus all variables are output to the file.

	   Note that with -fno-toplevel-reorder passes 5 and 6
	   are combined together in cgraph_output_in_order.

   Finally there are functions to manipulate the callgraph from the
   backend.

    - cgraph_add_new_function is used to add backend produced
      functions introduced after the unit is finalized.
      The functions are enqueued for later processing and inserted
      into the callgraph with cgraph_process_new_functions.

    - cgraph_function_versioning

      produces a copy of a function into a new one (a version)
      and applies simple transformations.  */

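/* Illustrative sketch only (not actual GCC code): under the interface
   described above, a front end would typically drive compilation in
   roughly this order.  The decl variables named here are hypothetical
   placeholders.

     // For each fully parsed function body:
     cgraph_node::finalize_function (fndecl, false);

     // For each finalized static variable:
     varpool_node::finalize_decl (vardecl);

     // Once the whole translation unit has been parsed:
     symtab->finalize_compilation_unit ();

     // Hand control to the middle end and back end:
     symtab->compile ();  */
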
#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "tm.h"
#include "tree.h"
#include "varasm.h"
#include "stor-layout.h"
#include "stringpool.h"
#include "output.h"
#include "rtl.h"
#include "predict.h"
#include "vec.h"
#include "hashtab.h"
#include "hash-set.h"
#include "machmode.h"
#include "hard-reg-set.h"
#include "input.h"
#include "function.h"
#include "basic-block.h"
#include "tree-ssa-alias.h"
#include "internal-fn.h"
#include "gimple-fold.h"
#include "gimple-expr.h"
#include "is-a.h"
#include "gimple.h"
#include "gimplify.h"
#include "gimple-iterator.h"
#include "gimplify-me.h"
#include "gimple-ssa.h"
#include "tree-cfg.h"
#include "tree-into-ssa.h"
#include "tree-ssa.h"
#include "tree-inline.h"
#include "langhooks.h"
#include "toplev.h"
#include "flags.h"
#include "debug.h"
#include "target.h"
#include "diagnostic.h"
#include "params.h"
#include "fibheap.h"
#include "intl.h"
#include "hash-map.h"
#include "plugin-api.h"
#include "ipa-ref.h"
#include "cgraph.h"
#include "alloc-pool.h"
#include "ipa-prop.h"
#include "tree-iterator.h"
#include "tree-pass.h"
#include "tree-dump.h"
#include "gimple-pretty-print.h"
#include "output.h"
#include "coverage.h"
#include "plugin.h"
#include "ipa-inline.h"
#include "ipa-utils.h"
#include "lto-streamer.h"
#include "except.h"
#include "cfgloop.h"
#include "regset.h" /* FIXME: For reg_obstack.  */
#include "context.h"
#include "pass_manager.h"
#include "tree-nested.h"
#include "gimplify.h"
#include "dbgcnt.h"
#include "tree-chkp.h"
#include "lto-section-names.h"
#include "omp-low.h"

/* Queue of cgraph nodes scheduled to be added into cgraph.  This is a
   secondary queue used during optimization to accommodate passes that
   may generate new functions that need to be optimized and expanded.  */
vec<cgraph_node *> cgraph_new_nodes;

static void expand_all_functions (void);
static void mark_functions_to_output (void);
static void handle_alias_pairs (void);

/* Used for vtable lookup in thunk adjusting.  */
static GTY (()) tree vtable_entry_type;

/* Determine if symbol declaration is needed.  That is, visible to something
   either outside this translation unit or something magic in the system
   configury.  */
bool
symtab_node::needed_p (void)
{
  /* Double check that no one output the function into assembly file
     early.  */
  gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
		       || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));

  if (!definition)
    return false;

  if (DECL_EXTERNAL (decl))
    return false;

  /* If the user told us it is used, then it must be so.  */
  if (force_output)
    return true;

  /* ABI forced symbols are needed when they are external.  */
  if (forced_by_abi && TREE_PUBLIC (decl))
    return true;

  /* Keep constructors, destructors and virtual functions.  */
  if (TREE_CODE (decl) == FUNCTION_DECL
      && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
    return true;

  /* Externally visible variables must be output.  The exception is
     COMDAT variables that must be output only when they are needed.  */
  if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
    return true;

  return false;
}

/* Head and terminator of the queue of nodes to be processed while building
   callgraph.  */

static symtab_node symtab_terminator;
static symtab_node *queued_nodes = &symtab_terminator;

/* Add NODE to queue starting at QUEUED_NODES.
   The queue is linked via AUX pointers and terminated by a pointer to
   symtab_terminator.  */

static void
enqueue_node (symtab_node *node)
{
  if (node->aux)
    return;
  gcc_checking_assert (queued_nodes);
  node->aux = queued_nodes;
  queued_nodes = node;
}

/* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
   functions into the callgraph in a way so they look like ordinary reachable
   functions inserted into the callgraph already at construction time.  */

void
symbol_table::process_new_functions (void)
{
  tree fndecl;

  if (!cgraph_new_nodes.exists ())
    return;

  handle_alias_pairs ();
  /* Note that this queue may grow as it is being processed, as the new
     functions may generate new ones.  */
  for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
    {
      cgraph_node *node = cgraph_new_nodes[i];
      fndecl = node->decl;
      switch (state)
	{
	case CONSTRUCTION:
	  /* At construction time we just need to finalize the function and
	     move it into the reachable functions list.  */

	  cgraph_node::finalize_function (fndecl, false);
	  call_cgraph_insertion_hooks (node);
	  enqueue_node (node);
	  break;

	case IPA:
	case IPA_SSA:
	  /* When IPA optimization has already started, do all essential
	     transformations that have been already performed on the whole
	     cgraph but not on this function.  */

	  gimple_register_cfg_hooks ();
	  if (!node->analyzed)
	    node->analyze ();
	  push_cfun (DECL_STRUCT_FUNCTION (fndecl));
	  if (state == IPA_SSA
	      && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
	    g->get_passes ()->execute_early_local_passes ();
	  else if (inline_summary_vec != NULL)
	    compute_inline_parameters (node, true);
	  free_dominance_info (CDI_POST_DOMINATORS);
	  free_dominance_info (CDI_DOMINATORS);
	  pop_cfun ();
	  call_cgraph_insertion_hooks (node);
	  break;

	case EXPANSION:
	  /* Functions created during expansion shall be compiled
	     directly.  */
	  node->process = 0;
	  call_cgraph_insertion_hooks (node);
	  node->expand ();
	  break;

	default:
	  gcc_unreachable ();
	  break;
	}
    }

  cgraph_new_nodes.release ();
}

/* As a GCC extension we allow redefinition of the function.  The
   semantics when both copies of bodies differ is not well defined.
   We replace the old body with the new body so in unit at a time mode
   we always use the new body, while in normal mode we may end up with
   the old body inlined into some functions and the new body expanded and
   inlined in others.

   ??? It may make more sense to use one body for inlining and the other
   body for expanding the function but this is difficult to do.  */

void
cgraph_node::reset (void)
{
  /* If process is set, then we have already begun whole-unit analysis.
     This is *not* testing for whether we've already emitted the function.
     That case can be sort-of legitimately seen with real function redefinition
     errors.  I would argue that the front end should never present us with
     such a case, but don't enforce that for now.  */
  gcc_assert (!process);

  /* Reset our data structures so we can analyze the function again.  */
  memset (&local, 0, sizeof (local));
  memset (&global, 0, sizeof (global));
  memset (&rtl, 0, sizeof (rtl));
  analyzed = false;
  definition = false;
  alias = false;
  weakref = false;
  cpp_implicit_alias = false;

  remove_callees ();
  remove_all_references ();
}

/* Return true when there are references to the node.  */

bool
symtab_node::referred_to_p (void)
{
  ipa_ref *ref = NULL;

  /* See if there are any references at all.  */
  if (iterate_referring (0, ref))
    return true;
  /* For functions check also calls.  */
  cgraph_node *cn = dyn_cast <cgraph_node *> (this);
  if (cn && cn->callers)
    return true;
  return false;
}

/* DECL has been parsed.  Take it, queue it, compile it at the whim of the
   logic in effect.  If NO_COLLECT is true, then our caller cannot stand to
   have the garbage collector run at the moment.  We would need to either
   create a new GC context, or just not compile right now.  */

void
cgraph_node::finalize_function (tree decl, bool no_collect)
{
  cgraph_node *node = cgraph_node::get_create (decl);

  if (node->definition)
    {
      /* Nested functions should only be defined once.  */
      gcc_assert (!DECL_CONTEXT (decl)
		  || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
      node->reset ();
      node->local.redefined_extern_inline = true;
    }

  notice_global_symbol (decl);
  node->definition = true;
  node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;

  /* With -fkeep-inline-functions we are keeping all inline functions except
     for extern inline ones.  */
  if (flag_keep_inline_functions
      && DECL_DECLARED_INLINE_P (decl)
      && !DECL_EXTERNAL (decl)
      && !DECL_DISREGARD_INLINE_LIMITS (decl))
    node->force_output = 1;

  /* When not optimizing, also output the static functions (see PR24561),
     but don't do so for always_inline functions, functions
     declared inline and nested functions.  These were optimized out
     in the original implementation and it is unclear whether we want
     to change the behavior here.  */
  if ((!optimize
       && !node->cpp_implicit_alias
       && !DECL_DISREGARD_INLINE_LIMITS (decl)
       && !DECL_DECLARED_INLINE_P (decl)
       && !(DECL_CONTEXT (decl)
	    && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
      && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
    node->force_output = 1;

  /* If we've not yet emitted decl, tell the debug info about it.  */
  if (!TREE_ASM_WRITTEN (decl))
    (*debug_hooks->deferred_inline_function) (decl);

  /* Possibly warn about unused parameters.  */
  if (warn_unused_parameter)
    do_warn_unused_parameter (decl);

  if (!no_collect)
    ggc_collect ();

  if (symtab->state == CONSTRUCTION
      && (node->needed_p () || node->referred_to_p ()))
    enqueue_node (node);
}

/* Add the function FNDECL to the call graph.
   Unlike finalize_function, this function is intended to be used
   by the middle end and allows insertion of new functions at an arbitrary
   point of compilation.  The function can be either in high, low or SSA form
   GIMPLE.

   The function is assumed to be reachable and have its address taken (so no
   API breaking optimizations are performed on it).

   The main work done by this function is to enqueue the function for later
   processing, to avoid the need for the passes to be re-entrant.  */

void
cgraph_node::add_new_function (tree fndecl, bool lowered)
{
  gcc::pass_manager *passes = g->get_passes ();
  cgraph_node *node;
  switch (symtab->state)
    {
    case PARSING:
      cgraph_node::finalize_function (fndecl, false);
      break;
    case CONSTRUCTION:
      /* Just enqueue function to be processed at nearest occurrence.  */
      node = cgraph_node::get_create (fndecl);
      if (lowered)
	node->lowered = true;
      cgraph_new_nodes.safe_push (node);
      break;

    case IPA:
    case IPA_SSA:
    case EXPANSION:
      /* Bring the function into finalized state and enqueue for later
	 analyzing and compilation.  */
      node = cgraph_node::get_create (fndecl);
      node->local.local = false;
      node->definition = true;
      node->force_output = true;
      if (!lowered && symtab->state == EXPANSION)
	{
	  push_cfun (DECL_STRUCT_FUNCTION (fndecl));
	  gimple_register_cfg_hooks ();
	  bitmap_obstack_initialize (NULL);
	  execute_pass_list (cfun, passes->all_lowering_passes);
	  passes->execute_early_local_passes ();
	  bitmap_obstack_release (NULL);
	  pop_cfun ();

	  lowered = true;
	}
      if (lowered)
	node->lowered = true;
      cgraph_new_nodes.safe_push (node);
      break;

    case FINISHED:
      /* At the very end of compilation we have to do all the work up
	 to expansion.  */
      node = cgraph_node::create (fndecl);
      if (lowered)
	node->lowered = true;
      node->definition = true;
      node->analyze ();
      push_cfun (DECL_STRUCT_FUNCTION (fndecl));
      gimple_register_cfg_hooks ();
      bitmap_obstack_initialize (NULL);
      if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
	g->get_passes ()->execute_early_local_passes ();
      bitmap_obstack_release (NULL);
      pop_cfun ();
      node->expand ();
      break;

    default:
      gcc_unreachable ();
    }

  /* Set a personality if required and we already passed EH lowering.  */
  if (lowered
      && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
	  == eh_personality_lang))
    DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
}

/* Analyze the function scheduled to be output.  */
void
cgraph_node::analyze (void)
{
  tree decl = this->decl;
  location_t saved_loc = input_location;
  input_location = DECL_SOURCE_LOCATION (decl);

  if (thunk.thunk_p)
    {
      create_edge (cgraph_node::get (thunk.alias),
		   NULL, 0, CGRAPH_FREQ_BASE);
      if (!expand_thunk (false, false))
	{
	  thunk.alias = NULL;
	  return;
	}
      thunk.alias = NULL;
    }
  if (alias)
    resolve_alias (cgraph_node::get (alias_target));
  else if (dispatcher_function)
    {
      /* Generate the dispatcher body of multi-versioned functions.  */
      cgraph_function_version_info *dispatcher_version_info
	= function_version ();
      if (dispatcher_version_info != NULL
	  && (dispatcher_version_info->dispatcher_resolver
	      == NULL_TREE))
	{
	  tree resolver = NULL_TREE;
	  gcc_assert (targetm.generate_version_dispatcher_body);
	  resolver = targetm.generate_version_dispatcher_body (this);
	  gcc_assert (resolver != NULL_TREE);
	}
    }
  else
    {
      push_cfun (DECL_STRUCT_FUNCTION (decl));

      assign_assembler_name_if_neeeded (decl);

      /* Make sure to gimplify bodies only once.  During analyzing a
	 function we lower it, which will require gimplified nested
	 functions, so we can end up here with an already gimplified
	 body.  */
      if (!gimple_has_body_p (decl))
	gimplify_function_tree (decl);
      dump_function (TDI_generic, decl);

      /* Lower the function.  */
      if (!lowered)
	{
	  if (nested)
	    lower_nested_functions (decl);
	  gcc_assert (!nested);

	  gimple_register_cfg_hooks ();
	  bitmap_obstack_initialize (NULL);
	  execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
	  free_dominance_info (CDI_POST_DOMINATORS);
	  free_dominance_info (CDI_DOMINATORS);
	  compact_blocks ();
	  bitmap_obstack_release (NULL);
	  lowered = true;
	}

      pop_cfun ();
    }
  analyzed = true;

  input_location = saved_loc;
}

/* The C++ frontend produces same body aliases all over the place, even before
   PCH gets streamed out.  It relies on us linking the aliases with their
   function in order to do the fixups, but ipa-ref is not PCH safe.
   Consequently we first produce aliases without links, but once the C++ FE is
   sure it won't stream PCH we build the links via this function.  */

void
symbol_table::process_same_body_aliases (void)
{
  symtab_node *node;
  FOR_EACH_SYMBOL (node)
    if (node->cpp_implicit_alias && !node->analyzed)
      node->resolve_alias
	(TREE_CODE (node->alias_target) == VAR_DECL
	 ? (symtab_node *)varpool_node::get_create (node->alias_target)
	 : (symtab_node *)cgraph_node::get_create (node->alias_target));
  cpp_implicit_aliases_done = true;
}

/* Process attributes common for vars and functions.  */

static void
process_common_attributes (symtab_node *node, tree decl)
{
  tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));

  if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
    {
      warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
		  "%<weakref%> attribute should be accompanied with"
		  " an %<alias%> attribute");
      DECL_WEAK (decl) = 0;
      DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						 DECL_ATTRIBUTES (decl));
    }

  if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
    node->no_reorder = 1;
}

/* Look for externally_visible and used attributes and mark cgraph nodes
   accordingly.

   We cannot mark the nodes at the point the attributes are processed (in
   handle_*_attribute) because the copy of the declarations available at that
   point may not be canonical.  For example, in:

    void f();
    void f() __attribute__((used));

   the declaration we see in handle_used_attribute will be the second
   declaration -- but the front end will subsequently merge that declaration
   with the original declaration and discard the second declaration.

   Furthermore, we can't mark these nodes in finalize_function because:

    void f() {}
    void f() __attribute__((externally_visible));

   is valid.

   So, we walk the nodes at the end of the translation unit, applying the
   attributes at that point.  */

static void
process_function_and_variable_attributes (cgraph_node *first,
					  varpool_node *first_var)
{
  cgraph_node *node;
  varpool_node *vnode;

  for (node = symtab->first_function (); node != first;
       node = symtab->next_function (node))
    {
      tree decl = node->decl;
      if (DECL_PRESERVE_P (decl))
	node->mark_force_output ();
      else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
	{
	  if (! TREE_PUBLIC (node->decl))
	    warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
			"%<externally_visible%>"
			" attribute has effect only on public objects");
	}
      if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
	  && (node->definition && !node->alias))
	{
	  warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
		      "%<weakref%> attribute ignored"
		      " because function is defined");
	  DECL_WEAK (decl) = 0;
	  DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						     DECL_ATTRIBUTES (decl));
	}

      if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
	  && !DECL_DECLARED_INLINE_P (decl)
	  /* redefining extern inline function makes it DECL_UNINLINABLE.  */
	  && !DECL_UNINLINABLE (decl))
	warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
		    "always_inline function might not be inlinable");

      process_common_attributes (node, decl);
    }
  for (vnode = symtab->first_variable (); vnode != first_var;
       vnode = symtab->next_variable (vnode))
    {
      tree decl = vnode->decl;
      if (DECL_EXTERNAL (decl)
	  && DECL_INITIAL (decl))
	varpool_node::finalize_decl (decl);
      if (DECL_PRESERVE_P (decl))
	vnode->force_output = true;
      else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
	{
	  if (! TREE_PUBLIC (vnode->decl))
	    warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
			"%<externally_visible%>"
			" attribute has effect only on public objects");
	}
      if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
	  && vnode->definition
	  && DECL_INITIAL (decl))
	{
	  warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
		      "%<weakref%> attribute ignored"
		      " because variable is initialized");
	  DECL_WEAK (decl) = 0;
	  DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
						     DECL_ATTRIBUTES (decl));
	}
      process_common_attributes (vnode, decl);
    }
}

/* Mark DECL as finalized.  By finalizing the declaration, the frontend
   instructs the middle end to output the variable to the asm file, if needed
   or externally visible.  */

void
varpool_node::finalize_decl (tree decl)
{
  varpool_node *node = varpool_node::get_create (decl);

  gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));

  if (node->definition)
    return;
  notice_global_symbol (decl);
  node->definition = true;
  if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
      /* Traditionally we do not eliminate static variables when not
	 optimizing and when not doing toplevel reorder.  */
      || node->no_reorder
      || ((!flag_toplevel_reorder
	   && !DECL_COMDAT (node->decl)
	   && !DECL_ARTIFICIAL (node->decl))))
    node->force_output = true;

  if (symtab->state == CONSTRUCTION
      && (node->needed_p () || node->referred_to_p ()))
    enqueue_node (node);
  if (symtab->state >= IPA_SSA)
    node->analyze ();
  /* Some frontends produce various interface variables after compilation
     finished.  */
  if (symtab->state == FINISHED
      || (!flag_toplevel_reorder
	  && symtab->state == EXPANSION))
    node->assemble_decl ();

  if (DECL_INITIAL (decl))
    chkp_register_var_initializer (decl);
}

/* EDGE is a polymorphic call.  Mark all possible targets as reachable
   and if there is only one target, perform trivial devirtualization.
   REACHABLE_CALL_TARGETS collects target lists we already walked to
   avoid duplicate work.  */

static void
walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
			       cgraph_edge *edge)
{
  unsigned int i;
  void *cache_token;
  bool final;
  vec <cgraph_node *> targets
    = possible_polymorphic_call_targets
	(edge, &final, &cache_token);

  if (!reachable_call_targets->add (cache_token))
    {
      if (symtab->dump_file)
	dump_possible_polymorphic_call_targets
	  (symtab->dump_file, edge);

      for (i = 0; i < targets.length (); i++)
	{
	  /* Do not bother to mark virtual methods in anonymous namespace;
	     either we will find use of virtual table defining it, or it is
	     unused.  */
	  if (targets[i]->definition
	      && TREE_CODE
		   (TREE_TYPE (targets[i]->decl))
		   == METHOD_TYPE
	      && !type_in_anonymous_namespace_p
		    (method_class_type
		       (TREE_TYPE (targets[i]->decl))))
	    enqueue_node (targets[i]);
	}
    }

  /* Very trivial devirtualization; when the type is
     final or anonymous (so we know all its derivation)
     and there is only one possible virtual call target,
     make the edge direct.  */
  if (final)
    {
      if (targets.length () <= 1 && dbg_cnt (devirt))
	{
	  cgraph_node *target;
	  if (targets.length () == 1)
	    target = targets[0];
	  else
	    target = cgraph_node::create
		       (builtin_decl_implicit (BUILT_IN_UNREACHABLE));

	  if (symtab->dump_file)
	    {
	      fprintf (symtab->dump_file,
		       "Devirtualizing call: ");
	      print_gimple_stmt (symtab->dump_file,
				 edge->call_stmt, 0,
				 TDF_SLIM);
	    }
	  if (dump_enabled_p ())
	    {
	      location_t locus = gimple_location_safe (edge->call_stmt);
	      dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, locus,
			       "devirtualizing call in %s to %s\n",
			       edge->caller->name (), target->name ());
	    }

	  edge->make_direct (target);
	  edge->redirect_call_stmt_to_callee ();

	  /* Call to __builtin_unreachable shouldn't be instrumented.  */
	  if (!targets.length ())
	    gimple_call_set_with_bounds (edge->call_stmt, false);

	  if (symtab->dump_file)
	    {
	      fprintf (symtab->dump_file,
		       "Devirtualized as: ");
	      print_gimple_stmt (symtab->dump_file,
				 edge->call_stmt, 0,
				 TDF_SLIM);
	    }
	}
    }
}


/* Discover all functions and variables that are trivially needed, analyze
   them as well as all functions and variables referred by them.  */
static cgraph_node *first_analyzed;
static varpool_node *first_analyzed_var;

static void
analyze_functions (void)
{
  /* Keep track of already processed nodes when called multiple times for
     intermodule optimization.  */
  cgraph_node *first_handled = first_analyzed;
  varpool_node *first_handled_var = first_analyzed_var;
  hash_set<void *> reachable_call_targets;

  symtab_node *node;
  symtab_node *next;
  int i;
  ipa_ref *ref;
  bool changed = true;
  location_t saved_loc = input_location;

  bitmap_obstack_initialize (NULL);
  symtab->state = CONSTRUCTION;
  input_location = UNKNOWN_LOCATION;

  /* Ugly, but the fixup cannot happen at the time the same body alias is
     created; the C++ FE is confused about the COMDAT groups being right.  */
  if (symtab->cpp_implicit_aliases_done)
    FOR_EACH_SYMBOL (node)
      if (node->cpp_implicit_alias)
	node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
  if (optimize && flag_devirtualize)
    build_type_inheritance_graph ();

  /* Analysis adds static variables that in turn add references to new
     functions.  So we need to iterate the process until it stabilizes.  */
  while (changed)
    {
      changed = false;
      process_function_and_variable_attributes (first_analyzed,
						first_analyzed_var);

      /* First identify the trivially needed symbols.  */
      for (node = symtab->first_symbol ();
	   node != first_analyzed
	   && node != first_analyzed_var; node = node->next)
	{
	  /* Convert COMDAT group designators to IDENTIFIER_NODEs.  */
	  node->get_comdat_group_id ();
	  if (node->needed_p ())
	    {
	      enqueue_node (node);
	      if (!changed && symtab->dump_file)
		fprintf (symtab->dump_file, "Trivially needed symbols:");
	      changed = true;
	      if (symtab->dump_file)
		fprintf (symtab->dump_file, " %s", node->asm_name ());
	      if (!changed && symtab->dump_file)
		fprintf (symtab->dump_file, "\n");
	    }
	  if (node == first_analyzed
	      || node == first_analyzed_var)
	    break;
	}
      symtab->process_new_functions ();
      first_analyzed_var = symtab->first_variable ();
      first_analyzed = symtab->first_function ();

      if (changed && symtab->dump_file)
	fprintf (symtab->dump_file, "\n");

      /* Lower representation, build callgraph edges and references for all
	 trivially needed symbols and all symbols referred by them.  */
      while (queued_nodes != &symtab_terminator)
	{
	  changed = true;
	  node = queued_nodes;
	  queued_nodes = (symtab_node *)queued_nodes->aux;
	  cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
	  if (cnode && cnode->definition)
	    {
	      cgraph_edge *edge;
	      tree decl = cnode->decl;

	      /* ??? It is possible to create extern inline function
		 and later using weak alias attribute to kill its body.
		 See gcc.c-torture/compile/20011119-1.c  */
	      if (!DECL_STRUCT_FUNCTION (decl)
		  && !cnode->alias
		  && !cnode->thunk.thunk_p
		  && !cnode->dispatcher_function)
		{
		  cnode->reset ();
		  cnode->local.redefined_extern_inline = true;
		  continue;
		}

	      if (!cnode->analyzed)
		cnode->analyze ();

	      for (edge = cnode->callees; edge; edge = edge->next_callee)
		if (edge->callee->definition)
		  enqueue_node (edge->callee);
	      if (optimize && flag_devirtualize)
		{
		  cgraph_edge *next;

		  for (edge = cnode->indirect_calls; edge; edge = next)
		    {
		      next = edge->next_callee;
		      if (edge->indirect_info->polymorphic)
			walk_polymorphic_call_targets (&reachable_call_targets,
						       edge);
		    }
		}

	      /* If decl is a clone of an abstract function,
		 mark that abstract function so that we don't release its
		 body.  The DECL_INITIAL() of that abstract function
		 declaration will be later needed to output debug info.  */
	      if (DECL_ABSTRACT_ORIGIN (decl))
		{
		  cgraph_node *origin_node
		    = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
		  origin_node->used_as_abstract_origin = true;
		}
	    }
	  else
	    {
	      varpool_node *vnode = dyn_cast <varpool_node *> (node);
	      if (vnode && vnode->definition && !vnode->analyzed)
		vnode->analyze ();
	    }

	  if (node->same_comdat_group)
	    {
	      symtab_node *next;
	      for (next = node->same_comdat_group;
		   next != node;
		   next = next->same_comdat_group)
		enqueue_node (next);
	    }
	  for (i = 0; node->iterate_reference (i, ref); i++)
	    if (ref->referred->definition)
	      enqueue_node (ref->referred);
	  symtab->process_new_functions ();
	}
    }
  if (optimize && flag_devirtualize)
    update_type_inheritance_graph ();

  /* Collect entry points to the unit.  */
  if (symtab->dump_file)
    {
      fprintf (symtab->dump_file, "\n\nInitial ");
      symtab_node::dump_table (symtab->dump_file);
    }

  if (symtab->dump_file)
1061 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1062
1063 for (node = symtab->first_symbol ();
1064 node != first_handled
1065 && node != first_handled_var; node = next)
1066 {
1067 next = node->next;
1068 if (!node->aux && !node->referred_to_p ())
1069 {
1070 if (symtab->dump_file)
1071 fprintf (symtab->dump_file, " %s", node->name ());
1072 node->remove ();
1073 continue;
1074 }
1075 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1076 {
1077 tree decl = node->decl;
1078
1079 if (cnode->definition && !gimple_has_body_p (decl)
1080 && !cnode->alias
1081 && !cnode->thunk.thunk_p)
1082 cnode->reset ();
1083
1084 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1085 || cnode->alias
1086 || gimple_has_body_p (decl));
1087 gcc_assert (cnode->analyzed == cnode->definition);
1088 }
1089 node->aux = NULL;
1090 }
1091 for (;node; node = node->next)
1092 node->aux = NULL;
1093 first_analyzed = symtab->first_function ();
1094 first_analyzed_var = symtab->first_variable ();
1095 if (symtab->dump_file)
1096 {
1097 fprintf (symtab->dump_file, "\n\nReclaimed ");
1098 symtab_node::dump_table (symtab->dump_file);
1099 }
1100 bitmap_obstack_release (NULL);
1101 ggc_collect ();
1102 /* Initialize assembler name hash, in particular we want to trigger C++
1103 mangling and same body alias creation before we free DECL_ARGUMENTS
1104 used by it. */
1105 if (!seen_error ())
1106 symtab->symtab_initialize_asm_name_hash ();
1107
1108 input_location = saved_loc;
1109 }
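/* The analysis loop above is a classic worklist fixed point: trivially
   needed symbols are enqueued, then each dequeued symbol enqueues every
   definition it references, until the queue drains.  A minimal
   standalone sketch of the same pattern (hypothetical node type; the
   AUX-as-mark convention mirrors enqueue_node, but this is NOT GCC's
   symtab_node):  */

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical graph node.  As with enqueue_node above, the AUX pointer
   doubles as the "already enqueued" mark, with a sentinel terminating
   the queue.  */
struct wl_node
{
  struct wl_node *aux;      /* next in queue, or NULL when never queued */
  struct wl_node *refs[4];  /* definitions this node references */
  int nrefs;
  bool needed;              /* trivially needed entry point */
  bool reachable;           /* computed by the walk */
};

static struct wl_node *wl_queue;
static struct wl_node wl_terminator;  /* sentinel, like symtab_terminator */

static void
wl_enqueue (struct wl_node *n)
{
  if (n->aux)
    return;  /* already queued or already processed */
  n->aux = wl_queue ? wl_queue : &wl_terminator;
  wl_queue = n;
}

/* Fixed-point walk: mark everything reachable from needed entry points.  */
static void
wl_analyze (struct wl_node *nodes, int count)
{
  for (int i = 0; i < count; i++)
    if (nodes[i].needed)
      wl_enqueue (&nodes[i]);
  while (wl_queue)
    {
      struct wl_node *n = wl_queue;
      wl_queue = n->aux == &wl_terminator ? NULL : n->aux;
      n->reachable = true;
      for (int i = 0; i < n->nrefs; i++)
        wl_enqueue (n->refs[i]);
    }
}
```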
1110
1111 /* Translate the ugly representation of aliases as alias pairs into the
1112 nicer representation in the callgraph.  We don't handle all cases yet,
1113 unfortunately. */
1114
1115 static void
1116 handle_alias_pairs (void)
1117 {
1118 alias_pair *p;
1119 unsigned i;
1120
1121 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1122 {
1123 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1124
1125 /* Weakrefs with a target not defined in the current unit are easy to handle:
1126 they behave just like external variables, except that we need to note the
1127 alias flag to later output the weakref pseudo-op into the asm file. */
1128 if (!target_node
1129 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1130 {
1131 symtab_node *node = symtab_node::get (p->decl);
1132 if (node)
1133 {
1134 node->alias_target = p->target;
1135 node->weakref = true;
1136 node->alias = true;
1137 }
1138 alias_pairs->unordered_remove (i);
1139 continue;
1140 }
1141 else if (!target_node)
1142 {
1143 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1144 symtab_node *node = symtab_node::get (p->decl);
1145 if (node)
1146 node->alias = false;
1147 alias_pairs->unordered_remove (i);
1148 continue;
1149 }
1150
1151 if (DECL_EXTERNAL (target_node->decl)
1152 /* We use local aliases for C++ thunks to force the tailcall
1153 to bind locally. This is a hack - to keep it working do
1154 the following (which is not strictly correct). */
1155 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1156 || ! DECL_VIRTUAL_P (target_node->decl))
1157 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1158 {
1159 error ("%q+D aliased to external symbol %qE",
1160 p->decl, p->target);
1161 }
1162
1163 if (TREE_CODE (p->decl) == FUNCTION_DECL
1164 && target_node && is_a <cgraph_node *> (target_node))
1165 {
1166 cgraph_node *src_node = cgraph_node::get (p->decl);
1167 if (src_node && src_node->definition)
1168 src_node->reset ();
1169 cgraph_node::create_alias (p->decl, target_node->decl);
1170 alias_pairs->unordered_remove (i);
1171 }
1172 else if (TREE_CODE (p->decl) == VAR_DECL
1173 && target_node && is_a <varpool_node *> (target_node))
1174 {
1175 varpool_node::create_alias (p->decl, target_node->decl);
1176 alias_pairs->unordered_remove (i);
1177 }
1178 else
1179 {
1180 error ("%q+D alias between function and variable is not supported",
1181 p->decl);
1182 warning (0, "%q+D aliased declaration",
1183 target_node->decl);
1184 alias_pairs->unordered_remove (i);
1185 }
1186 }
1187 vec_free (alias_pairs);
1188 }
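/* Note how the loop above only advances I when a pair is kept:
   vec::unordered_remove overwrites slot I with the last element, so the
   new occupant must be examined before moving on.  The same
   remove-while-iterating idiom on a plain array (hypothetical helper,
   not GCC's vec API):  */

```c
#include <assert.h>
#include <stdbool.h>

/* Remove every element matching PRED in O(1) each, by overwriting the
   slot with the last element; order is not preserved.  Returns the new
   length.  I only advances when an element is kept, since the
   swapped-in tail element still has to be tested.  */
static int
unordered_remove_if (int *v, int len, bool (*pred) (int))
{
  int i = 0;
  while (i < len)
    {
      if (pred (v[i]))
        v[i] = v[--len];  /* swap in the tail; re-examine slot i */
      else
        i++;
    }
  return len;
}
```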
1189
1190
1191 /* Figure out what functions we want to assemble. */
1192
1193 static void
1194 mark_functions_to_output (void)
1195 {
1196 cgraph_node *node;
1197 #ifdef ENABLE_CHECKING
1198 bool check_same_comdat_groups = false;
1199
1200 FOR_EACH_FUNCTION (node)
1201 gcc_assert (!node->process);
1202 #endif
1203
1204 FOR_EACH_FUNCTION (node)
1205 {
1206 tree decl = node->decl;
1207
1208 gcc_assert (!node->process || node->same_comdat_group);
1209 if (node->process)
1210 continue;
1211
1212 /* We need to output all local functions that are used and not
1213 always inlined, as well as those that are reachable from
1214 outside the current compilation unit. */
1215 if (node->analyzed
1216 && !node->thunk.thunk_p
1217 && !node->alias
1218 && !node->global.inlined_to
1219 && !TREE_ASM_WRITTEN (decl)
1220 && !DECL_EXTERNAL (decl))
1221 {
1222 node->process = 1;
1223 if (node->same_comdat_group)
1224 {
1225 cgraph_node *next;
1226 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1227 next != node;
1228 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1229 if (!next->thunk.thunk_p && !next->alias
1230 && !next->comdat_local_p ())
1231 next->process = 1;
1232 }
1233 }
1234 else if (node->same_comdat_group)
1235 {
1236 #ifdef ENABLE_CHECKING
1237 check_same_comdat_groups = true;
1238 #endif
1239 }
1240 else
1241 {
1242 /* We should've reclaimed all functions that are not needed. */
1243 #ifdef ENABLE_CHECKING
1244 if (!node->global.inlined_to
1245 && gimple_has_body_p (decl)
1246 /* FIXME: in an ltrans unit when the offline copy is outside a partition
1247 but inline copies are inside a partition, we can end up not removing
1248 the body since we no longer have an analyzed node pointing to it. */
1249 && !node->in_other_partition
1250 && !node->alias
1251 && !node->clones
1252 && !DECL_EXTERNAL (decl))
1253 {
1254 node->debug ();
1255 internal_error ("failed to reclaim unneeded function");
1256 }
1257 #endif
1258 gcc_assert (node->global.inlined_to
1259 || !gimple_has_body_p (decl)
1260 || node->in_other_partition
1261 || node->clones
1262 || DECL_ARTIFICIAL (decl)
1263 || DECL_EXTERNAL (decl));
1264
1265 }
1266
1267 }
1268 #ifdef ENABLE_CHECKING
1269 if (check_same_comdat_groups)
1270 FOR_EACH_FUNCTION (node)
1271 if (node->same_comdat_group && !node->process)
1272 {
1273 tree decl = node->decl;
1274 if (!node->global.inlined_to
1275 && gimple_has_body_p (decl)
1276 /* FIXME: in an ltrans unit when the offline copy is outside a
1277 partition but inline copies are inside a partition, we can
1278 end up not removing the body since we no longer have an
1279 analyzed node pointing to it. */
1280 && !node->in_other_partition
1281 && !node->clones
1282 && !DECL_EXTERNAL (decl))
1283 {
1284 node->debug ();
1285 internal_error ("failed to reclaim unneeded function in same "
1286 "comdat group");
1287 }
1288 }
1289 #endif
1290 }
1291
1292 /* DECL is a FUNCTION_DECL. Initialize data structures so DECL is a function
1293 in lowered GIMPLE form. IN_SSA is true if the GIMPLE is in SSA form.
1294
1295 Set current_function_decl and cfun to the newly constructed empty function body.
1296 Return the basic block in the function body. */
1297
1298 basic_block
1299 init_lowered_empty_function (tree decl, bool in_ssa)
1300 {
1301 basic_block bb;
1302
1303 current_function_decl = decl;
1304 allocate_struct_function (decl, false);
1305 gimple_register_cfg_hooks ();
1306 init_empty_tree_cfg ();
1307
1308 if (in_ssa)
1309 {
1310 init_tree_ssa (cfun);
1311 init_ssa_operands (cfun);
1312 cfun->gimple_df->in_ssa_p = true;
1313 cfun->curr_properties |= PROP_ssa;
1314 }
1315
1316 DECL_INITIAL (decl) = make_node (BLOCK);
1317
1318 DECL_SAVED_TREE (decl) = error_mark_node;
1319 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1320 | PROP_cfg | PROP_loops);
1321
1322 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1323 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1324 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1325
1326 /* Create BB for body of the function and connect it properly. */
1327 bb = create_basic_block (NULL, (void *) 0, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1328 make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1329 make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1330 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1331
1332 return bb;
1333 }
1334
1335 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1336 offset indicated by VIRTUAL_OFFSET, if that is
1337 non-null. THIS_ADJUSTING is nonzero for a this-adjusting thunk and
1338 zero for a result-adjusting thunk. */
1339
1340 static tree
1341 thunk_adjust (gimple_stmt_iterator * bsi,
1342 tree ptr, bool this_adjusting,
1343 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1344 {
1345 gimple stmt;
1346 tree ret;
1347
1348 if (this_adjusting
1349 && fixed_offset != 0)
1350 {
1351 stmt = gimple_build_assign
1352 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1353 ptr,
1354 fixed_offset));
1355 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1356 }
1357
1358 /* If there's a virtual offset, look up that value in the vtable and
1359 adjust the pointer again. */
1360 if (virtual_offset)
1361 {
1362 tree vtabletmp;
1363 tree vtabletmp2;
1364 tree vtabletmp3;
1365
1366 if (!vtable_entry_type)
1367 {
1368 tree vfunc_type = make_node (FUNCTION_TYPE);
1369 TREE_TYPE (vfunc_type) = integer_type_node;
1370 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1371 layout_type (vfunc_type);
1372
1373 vtable_entry_type = build_pointer_type (vfunc_type);
1374 }
1375
1376 vtabletmp =
1377 create_tmp_reg (build_pointer_type
1378 (build_pointer_type (vtable_entry_type)), "vptr");
1379
1380 /* The vptr is always at offset zero in the object. */
1381 stmt = gimple_build_assign (vtabletmp,
1382 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1383 ptr));
1384 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1385
1386 /* Form the vtable address. */
1387 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1388 "vtableaddr");
1389 stmt = gimple_build_assign (vtabletmp2,
1390 build_simple_mem_ref (vtabletmp));
1391 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1392
1393 /* Find the entry with the vcall offset. */
1394 stmt = gimple_build_assign (vtabletmp2,
1395 fold_build_pointer_plus_loc (input_location,
1396 vtabletmp2,
1397 virtual_offset));
1398 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1399
1400 /* Get the offset itself. */
1401 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1402 "vcalloffset");
1403 stmt = gimple_build_assign (vtabletmp3,
1404 build_simple_mem_ref (vtabletmp2));
1405 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1406
1407 /* Adjust the `this' pointer. */
1408 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1409 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1410 GSI_CONTINUE_LINKING);
1411 }
1412
1413 if (!this_adjusting
1414 && fixed_offset != 0)
1415 /* Adjust the pointer by the constant. */
1416 {
1417 tree ptrtmp;
1418
1419 if (TREE_CODE (ptr) == VAR_DECL)
1420 ptrtmp = ptr;
1421 else
1422 {
1423 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1424 stmt = gimple_build_assign (ptrtmp, ptr);
1425 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1426 }
1427 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1428 ptrtmp, fixed_offset);
1429 }
1430
1431 /* Emit the statement and gimplify the adjustment expression. */
1432 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1433 stmt = gimple_build_assign (ret, ptr);
1434 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1435
1436 return ret;
1437 }
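/* Stripped of the GIMPLE building, the pointer arithmetic thunk_adjust
   emits is: add FIXED_OFFSET; then, if there is a virtual offset, load
   the vptr (always at offset zero in the object), read the vcall offset
   stored VIRTUAL_OFFSET bytes into the vtable, and add it.  A plain-C
   simulation of that math (hypothetical object layout, assuming the
   vcall slot holds an intptr_t):  */

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Simulate a this-adjusting thunk: PTR points at the object, whose
   first field is the vptr; the vtable slot VIRTUAL_OFFSET bytes in
   holds a further intptr_t adjustment (the vcall offset).  */
static char *
adjust_this (char *ptr, intptr_t fixed_offset,
             int has_virtual, intptr_t virtual_offset)
{
  ptr += fixed_offset;                       /* the constant part */
  if (has_virtual)
    {
      char *vtable;
      memcpy (&vtable, ptr, sizeof vtable);  /* vptr is at offset 0 */
      intptr_t vcall;
      memcpy (&vcall, vtable + virtual_offset, sizeof vcall);
      ptr += vcall;                          /* the vtable-driven part */
    }
  return ptr;
}
```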
1438
1439 /* Expand the thunk to GIMPLE if possible.
1440 When FORCE_GIMPLE_THUNK is true, a GIMPLE thunk is created and
1441 no assembler is produced.
1442 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1443 thunks that are not lowered. */
1444
1445 bool
1446 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1447 {
1448 bool this_adjusting = thunk.this_adjusting;
1449 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1450 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1451 tree virtual_offset = NULL;
1452 tree alias = callees->callee->decl;
1453 tree thunk_fndecl = decl;
1454 tree a;
1455
1456
1457 if (!force_gimple_thunk && this_adjusting
1458 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1459 virtual_value, alias))
1460 {
1461 const char *fnname;
1462 tree fn_block;
1463 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1464
1465 if (!output_asm_thunks)
1466 {
1467 analyzed = true;
1468 return false;
1469 }
1470
1471 if (in_lto_p)
1472 get_body ();
1473 a = DECL_ARGUMENTS (thunk_fndecl);
1474
1475 current_function_decl = thunk_fndecl;
1476
1477 /* Ensure thunks are emitted in their correct sections. */
1478 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1479
1480 DECL_RESULT (thunk_fndecl)
1481 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1482 RESULT_DECL, 0, restype);
1483 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1484 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1485
1486 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1487 create one. */
1488 fn_block = make_node (BLOCK);
1489 BLOCK_VARS (fn_block) = a;
1490 DECL_INITIAL (thunk_fndecl) = fn_block;
1491 init_function_start (thunk_fndecl);
1492 cfun->is_thunk = 1;
1493 insn_locations_init ();
1494 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1495 prologue_location = curr_insn_location ();
1496 assemble_start_function (thunk_fndecl, fnname);
1497
1498 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1499 fixed_offset, virtual_value, alias);
1500
1501 assemble_end_function (thunk_fndecl, fnname);
1502 insn_locations_finalize ();
1503 init_insn_lengths ();
1504 free_after_compilation (cfun);
1505 set_cfun (NULL);
1506 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1507 thunk.thunk_p = false;
1508 analyzed = false;
1509 }
1510 else
1511 {
1512 tree restype;
1513 basic_block bb, then_bb, else_bb, return_bb;
1514 gimple_stmt_iterator bsi;
1515 int nargs = 0;
1516 tree arg;
1517 int i;
1518 tree resdecl;
1519 tree restmp = NULL;
1520
1521 gimple call;
1522 gimple ret;
1523
1524 if (in_lto_p)
1525 get_body ();
1526 a = DECL_ARGUMENTS (thunk_fndecl);
1527
1528 current_function_decl = thunk_fndecl;
1529
1530 /* Ensure thunks are emitted in their correct sections. */
1531 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1532
1533 DECL_IGNORED_P (thunk_fndecl) = 1;
1534 bitmap_obstack_initialize (NULL);
1535
1536 if (thunk.virtual_offset_p)
1537 virtual_offset = size_int (virtual_value);
1538
1539 /* Build the return declaration for the function. */
1540 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1541 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1542 {
1543 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1544 DECL_ARTIFICIAL (resdecl) = 1;
1545 DECL_IGNORED_P (resdecl) = 1;
1546 DECL_RESULT (thunk_fndecl) = resdecl;
1547 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1548 }
1549 else
1550 resdecl = DECL_RESULT (thunk_fndecl);
1551
1552 bb = then_bb = else_bb = return_bb = init_lowered_empty_function (thunk_fndecl, true);
1553
1554 bsi = gsi_start_bb (bb);
1555
1556 /* Build call to the function being thunked. */
1557 if (!VOID_TYPE_P (restype))
1558 {
1559 if (DECL_BY_REFERENCE (resdecl))
1560 {
1561 restmp = gimple_fold_indirect_ref (resdecl);
1562 if (!restmp)
1563 restmp = build2 (MEM_REF,
1564 TREE_TYPE (TREE_TYPE (DECL_RESULT (alias))),
1565 resdecl,
1566 build_int_cst (TREE_TYPE
1567 (DECL_RESULT (alias)), 0));
1568 }
1569 else if (!is_gimple_reg_type (restype))
1570 {
1571 restmp = resdecl;
1572
1573 if (TREE_CODE (restmp) == VAR_DECL)
1574 add_local_decl (cfun, restmp);
1575 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1576 }
1577 else
1578 restmp = create_tmp_reg (restype, "retval");
1579 }
1580
1581 for (arg = a; arg; arg = DECL_CHAIN (arg))
1582 nargs++;
1583 auto_vec<tree> vargs (nargs);
1584 if (this_adjusting)
1585 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1586 virtual_offset));
1587 else if (nargs)
1588 vargs.quick_push (a);
1589
1590 if (nargs)
1591 for (i = 1, arg = DECL_CHAIN (a); i < nargs; i++, arg = DECL_CHAIN (arg))
1592 {
1593 tree tmp = arg;
1594 if (!is_gimple_val (arg))
1595 {
1596 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1597 (TREE_TYPE (arg)), "arg");
1598 gimple stmt = gimple_build_assign (tmp, arg);
1599 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1600 }
1601 vargs.quick_push (tmp);
1602 }
1603 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1604 callees->call_stmt = call;
1605 gimple_call_set_from_thunk (call, true);
1606 gimple_call_set_with_bounds (call, instrumentation_clone);
1607 if (restmp)
1608 {
1609 gimple_call_set_lhs (call, restmp);
1610 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1611 TREE_TYPE (TREE_TYPE (alias))));
1612 }
1613 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1614 if (!(gimple_call_flags (call) & ECF_NORETURN))
1615 {
1616 if (restmp && !this_adjusting
1617 && (fixed_offset || virtual_offset))
1618 {
1619 tree true_label = NULL_TREE;
1620
1621 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1622 {
1623 gimple stmt;
1624 /* If the return type is a pointer, we need to
1625 protect against NULL. We know there will be an
1626 adjustment, because that's why we're emitting a
1627 thunk. */
1628 then_bb = create_basic_block (NULL, (void *) 0, bb);
1629 return_bb = create_basic_block (NULL, (void *) 0, then_bb);
1630 else_bb = create_basic_block (NULL, (void *) 0, else_bb);
1631 add_bb_to_loop (then_bb, bb->loop_father);
1632 add_bb_to_loop (return_bb, bb->loop_father);
1633 add_bb_to_loop (else_bb, bb->loop_father);
1634 remove_edge (single_succ_edge (bb));
1635 true_label = gimple_block_label (then_bb);
1636 stmt = gimple_build_cond (NE_EXPR, restmp,
1637 build_zero_cst (TREE_TYPE (restmp)),
1638 NULL_TREE, NULL_TREE);
1639 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1640 make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1641 make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1642 make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1643 make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1644 make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1645 bsi = gsi_last_bb (then_bb);
1646 }
1647
1648 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1649 fixed_offset, virtual_offset);
1650 if (true_label)
1651 {
1652 gimple stmt;
1653 bsi = gsi_last_bb (else_bb);
1654 stmt = gimple_build_assign (restmp,
1655 build_zero_cst (TREE_TYPE (restmp)));
1656 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1657 bsi = gsi_last_bb (return_bb);
1658 }
1659 }
1660 else
1661 gimple_call_set_tail (call, true);
1662
1663 /* Build return value. */
1664 if (!DECL_BY_REFERENCE (resdecl))
1665 ret = gimple_build_return (restmp);
1666 else
1667 ret = gimple_build_return (resdecl);
1668
1669 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1670 }
1671 else
1672 {
1673 gimple_call_set_tail (call, true);
1674 remove_edge (single_succ_edge (bb));
1675 }
1676
1677 cfun->gimple_df->in_ssa_p = true;
1678 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1679 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1680 delete_unreachable_blocks ();
1681 update_ssa (TODO_update_ssa);
1682 #ifdef ENABLE_CHECKING
1683 verify_flow_info ();
1684 #endif
1685 free_dominance_info (CDI_DOMINATORS);
1686
1687 /* Since we want to emit the thunk, we explicitly mark its name as
1688 referenced. */
1689 thunk.thunk_p = false;
1690 lowered = true;
1691 bitmap_obstack_release (NULL);
1692 }
1693 current_function_decl = NULL;
1694 set_cfun (NULL);
1695 return true;
1696 }
1697
1698 /* Assemble thunks and aliases associated with the node. */
1699
1700 void
1701 cgraph_node::assemble_thunks_and_aliases (void)
1702 {
1703 cgraph_edge *e;
1704 ipa_ref *ref;
1705
1706 for (e = callers; e;)
1707 if (e->caller->thunk.thunk_p
1708 && !e->caller->thunk.add_pointer_bounds_args)
1709 {
1710 cgraph_node *thunk = e->caller;
1711
1712 e = e->next_caller;
1713 thunk->expand_thunk (true, false);
1714 thunk->assemble_thunks_and_aliases ();
1715 }
1716 else
1717 e = e->next_caller;
1718
1719 FOR_EACH_ALIAS (this, ref)
1720 {
1721 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
1722 bool saved_written = TREE_ASM_WRITTEN (decl);
1723
1724 /* Force assemble_alias to really output the alias this time instead
1725 of buffering it in same alias pairs. */
1726 TREE_ASM_WRITTEN (decl) = 1;
1727 do_assemble_alias (alias->decl,
1728 DECL_ASSEMBLER_NAME (decl));
1729 alias->assemble_thunks_and_aliases ();
1730 TREE_ASM_WRITTEN (decl) = saved_written;
1731 }
1732 }
1733
1734 /* Expand function specified by node. */
1735
1736 void
1737 cgraph_node::expand (void)
1738 {
1739 location_t saved_loc;
1740
1741 /* We ought to not compile any inline clones. */
1742 gcc_assert (!global.inlined_to);
1743
1744 announce_function (decl);
1745 process = 0;
1746 gcc_assert (lowered);
1747 get_body ();
1748
1749 /* Generate RTL for the body of DECL. */
1750
1751 timevar_push (TV_REST_OF_COMPILATION);
1752
1753 gcc_assert (symtab->global_info_ready);
1754
1755 /* Initialize the default bitmap obstack. */
1756 bitmap_obstack_initialize (NULL);
1757
1758 /* Initialize the RTL code for the function. */
1759 current_function_decl = decl;
1760 saved_loc = input_location;
1761 input_location = DECL_SOURCE_LOCATION (decl);
1762 init_function_start (decl);
1763
1764 gimple_register_cfg_hooks ();
1765
1766 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation.  */
1767
1768 execute_all_ipa_transforms ();
1769
1770 /* Perform all tree transforms and optimizations. */
1771
1772 /* Signal the start of passes. */
1773 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1774
1775 execute_pass_list (cfun, g->get_passes ()->all_passes);
1776
1777 /* Signal the end of passes. */
1778 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1779
1780 bitmap_obstack_release (&reg_obstack);
1781
1782 /* Release the default bitmap obstack. */
1783 bitmap_obstack_release (NULL);
1784
1785 /* If requested, warn about function definitions where the function will
1786 return a value (usually of some struct or union type) which itself will
1787 take up a lot of stack space. */
1788 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1789 {
1790 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1791
1792 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1793 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1794 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1795 larger_than_size))
1796 {
1797 unsigned int size_as_int
1798 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
1799
1800 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
1801 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
1802 decl, size_as_int);
1803 else
1804 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
1805 decl, larger_than_size);
1806 }
1807 }
1808
1809 gimple_set_body (decl, NULL);
1810 if (DECL_STRUCT_FUNCTION (decl) == 0
1811 && !cgraph_node::get (decl)->origin)
1812 {
1813 /* Stop pointing to the local nodes about to be freed.
1814 But DECL_INITIAL must remain nonzero so we know this
1815 was an actual function definition.
1816 For a nested function, this is done in c_pop_function_context.
1817 If rest_of_compilation set this to 0, leave it 0. */
1818 if (DECL_INITIAL (decl) != 0)
1819 DECL_INITIAL (decl) = error_mark_node;
1820 }
1821
1822 input_location = saved_loc;
1823
1824 ggc_collect ();
1825 timevar_pop (TV_REST_OF_COMPILATION);
1826
1827 /* Make sure that the back end didn't give up on compiling. */
1828 gcc_assert (TREE_ASM_WRITTEN (decl));
1829 set_cfun (NULL);
1830 current_function_decl = NULL;
1831
1832 /* It would make a lot more sense to output thunks before the function body
1833 to get more forward and fewer backward jumps.  This however would need
1834 solving the problem with comdats.  See PR48668.  Also aliases must come
1835 after the function itself to make one-pass assemblers, like the one on
1836 AIX, happy.  See PR 50689.  FIXME: Perhaps thunks should be moved before
1837 the function IFF they are not in comdat groups. */
1838 assemble_thunks_and_aliases ();
1839 release_body ();
1840 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
1841 points to the dead function body. */
1842 remove_callees ();
1843 remove_all_references ();
1844 }
1845
1846 /* Node comparator responsible for the order that corresponds to the
1847 time when a function was executed for the first time. */
1848
1849 static int
1850 node_cmp (const void *pa, const void *pb)
1851 {
1852 const cgraph_node *a = *(const cgraph_node * const *) pa;
1853 const cgraph_node *b = *(const cgraph_node * const *) pb;
1854
1855 /* Functions with a time profile must come before those without one. */
1856 if (!a->tp_first_run || !b->tp_first_run)
1857 return a->tp_first_run - b->tp_first_run;
1858
1859 return a->tp_first_run != b->tp_first_run
1860 ? b->tp_first_run - a->tp_first_run
1861 : b->order - a->order;
1862 }
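/* Since expand_all_functions below walks the sorted array backwards,
   the effective expansion order is: profiled functions in increasing
   tp_first_run, then unprofiled ones.  A self-contained qsort
   comparator using the same two-key pattern, written for an ordinary
   forward walk (hypothetical record type, not cgraph_node):  */

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical record: FIRST_RUN is the time-profile stamp (0 means no
   profile data), ORDER breaks ties.  Profiled records sort first, by
   increasing FIRST_RUN, then by increasing ORDER.  */
struct rec
{
  int first_run;
  int order;
};

static int
rec_cmp (const void *pa, const void *pb)
{
  const struct rec *a = (const struct rec *) pa;
  const struct rec *b = (const struct rec *) pb;

  if (!a->first_run != !b->first_run)
    return !a->first_run ? 1 : -1;  /* unprofiled records go last */
  if (a->first_run != b->first_run)
    return a->first_run - b->first_run;
  return a->order - b->order;
}
```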
1863
1864 /* Expand all functions that must be output.
1865
1866 Attempt to topologically sort the nodes so a function is output when
1867 all called functions are already assembled to allow data to be
1868 propagated across the callgraph. Use a stack to get smaller distance
1869 between a function and its callees (later we may choose to use a more
1870 sophisticated algorithm for function reordering; we will likely want
1871 to use subsections to make the output functions appear in top-down
1872 order). */
1873
1874 static void
1875 expand_all_functions (void)
1876 {
1877 cgraph_node *node;
1878 cgraph_node **order = XCNEWVEC (cgraph_node *,
1879 symtab->cgraph_count);
1880 unsigned int expanded_func_count = 0, profiled_func_count = 0;
1881 int order_pos, new_order_pos = 0;
1882 int i;
1883
1884 order_pos = ipa_reverse_postorder (order);
1885 gcc_assert (order_pos == symtab->cgraph_count);
1886
1887 /* The garbage collector may remove inline clones we eliminate during
1888 optimization, so we must be sure not to reference them. */
1889 for (i = 0; i < order_pos; i++)
1890 if (order[i]->process)
1891 order[new_order_pos++] = order[i];
1892
1893 if (flag_profile_reorder_functions)
1894 qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);
1895
1896 for (i = new_order_pos - 1; i >= 0; i--)
1897 {
1898 node = order[i];
1899
1900 if (node->process)
1901 {
1902 expanded_func_count++;
1903 if (node->tp_first_run)
1904 profiled_func_count++;
1905
1906 if (symtab->dump_file)
1907 fprintf (symtab->dump_file,
1908 "Time profile order in expand_all_functions:%s:%d\n",
1909 node->asm_name (), node->tp_first_run);
1910 node->process = 0;
1911 node->expand ();
1912 }
1913 }
1914
1915 if (dump_file)
1916 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
1917 main_input_filename, profiled_func_count, expanded_func_count);
1918
1919 if (symtab->dump_file && flag_profile_reorder_functions)
1920 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
1921 profiled_func_count, expanded_func_count);
1922
1923 symtab->process_new_functions ();
1924 free_gimplify_stack ();
1925
1926 free (order);
1927 }
1928
1929 /* This is used to sort the node types by the cgraph order number. */
1930
1931 enum cgraph_order_sort_kind
1932 {
1933 ORDER_UNDEFINED = 0,
1934 ORDER_FUNCTION,
1935 ORDER_VAR,
1936 ORDER_ASM
1937 };
1938
1939 struct cgraph_order_sort
1940 {
1941 enum cgraph_order_sort_kind kind;
1942 union
1943 {
1944 cgraph_node *f;
1945 varpool_node *v;
1946 asm_node *a;
1947 } u;
1948 };
1949
1950 /* Output all functions, variables, and asm statements in the order
1951 according to their order fields, which is the order in which they
1952 appeared in the file. This implements -fno-toplevel-reorder. In
1953 this mode we may output functions and variables which don't really
1954 need to be output.
1955 When NO_REORDER is true only do this for symbols marked no reorder. */
1956
1957 static void
1958 output_in_order (bool no_reorder)
1959 {
1960 int max;
1961 cgraph_order_sort *nodes;
1962 int i;
1963 cgraph_node *pf;
1964 varpool_node *pv;
1965 asm_node *pa;
1966 max = symtab->order;
1967 nodes = XCNEWVEC (cgraph_order_sort, max);
1968
1969 FOR_EACH_DEFINED_FUNCTION (pf)
1970 {
1971 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
1972 {
1973 if (no_reorder && !pf->no_reorder)
1974 continue;
1975 i = pf->order;
1976 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1977 nodes[i].kind = ORDER_FUNCTION;
1978 nodes[i].u.f = pf;
1979 }
1980 }
1981
1982 FOR_EACH_DEFINED_VARIABLE (pv)
1983 if (!DECL_EXTERNAL (pv->decl))
1984 {
1985 if (no_reorder && !pv->no_reorder)
1986 continue;
1987 i = pv->order;
1988 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1989 nodes[i].kind = ORDER_VAR;
1990 nodes[i].u.v = pv;
1991 }
1992
1993 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
1994 {
1995 i = pa->order;
1996 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1997 nodes[i].kind = ORDER_ASM;
1998 nodes[i].u.a = pa;
1999 }
2000
2001 /* In toplevel reorder mode we output all statics; mark them as needed. */
2002
2003 for (i = 0; i < max; ++i)
2004 if (nodes[i].kind == ORDER_VAR)
2005 nodes[i].u.v->finalize_named_section_flags ();
2006
2007 for (i = 0; i < max; ++i)
2008 {
2009 switch (nodes[i].kind)
2010 {
2011 case ORDER_FUNCTION:
2012 nodes[i].u.f->process = 0;
2013 nodes[i].u.f->expand ();
2014 break;
2015
2016 case ORDER_VAR:
2017 nodes[i].u.v->assemble_decl ();
2018 break;
2019
2020 case ORDER_ASM:
2021 assemble_asm (nodes[i].u.a->asm_str);
2022 break;
2023
2024 case ORDER_UNDEFINED:
2025 break;
2026
2027 default:
2028 gcc_unreachable ();
2029 }
2030 }
2031
2032 symtab->clear_asm_symbols ();
2033
2034 free (nodes);
2035 }
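/* output_in_order is effectively a bucket sort: every symbol carries a
   unique order number stamped at creation, so placing each into
   NODES[order] and walking the array replays source order, with
   untouched slots left as ORDER_UNDEFINED.  The same idiom standalone
   (hypothetical item type, not symtab nodes):  */

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Emit items in creation order.  Each item carries a unique ORDER stamp
   in [0, MAX); slots never claimed stay NULL, like ORDER_UNDEFINED.
   Returns the number of names written to OUT.  */
struct item
{
  int order;
  const char *name;
};

static int
emit_in_order (struct item **items, int count, int max, const char **out)
{
  struct item **slots = (struct item **) calloc (max, sizeof *slots);
  int emitted = 0;
  for (int i = 0; i < count; i++)
    slots[items[i]->order] = items[i];   /* bucket by creation stamp */
  for (int i = 0; i < max; i++)
    if (slots[i])
      out[emitted++] = slots[i]->name;   /* replay in stamped order */
  free (slots);
  return emitted;
}
```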

static void
ipa_passes (void)
{
  gcc::pass_manager *passes = g->get_passes ();

  set_cfun (NULL);
  current_function_decl = NULL;
  gimple_register_cfg_hooks ();
  bitmap_obstack_initialize (NULL);

  invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);

  if (!in_lto_p)
    {
      execute_ipa_pass_list (passes->all_small_ipa_passes);
      if (seen_error ())
        return;
    }

  /* This extra symtab_remove_unreachable_nodes pass tends to catch some
     devirtualization and other changes after which unreachable-node
     removal has to iterate.  */
  symtab->remove_unreachable_nodes (true, symtab->dump_file);

  /* If pass_all_early_optimizations was not scheduled, the state of
     the cgraph will not be properly updated.  Update it now.  */
  if (symtab->state < IPA_SSA)
    symtab->state = IPA_SSA;

  if (!in_lto_p)
    {
      /* Generate coverage variables and constructors.  */
      coverage_finish ();

      /* Process new functions added.  */
      set_cfun (NULL);
      current_function_decl = NULL;
      symtab->process_new_functions ();

      execute_ipa_summary_passes
        ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
    }

  /* Some targets need to handle LTO assembler output specially.  */
  if (flag_generate_lto)
    targetm.asm_out.lto_start ();

  if (!in_lto_p)
    {
      if (g->have_offload)
        {
          section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
          ipa_write_summaries (true);
        }
      if (flag_lto)
        {
          section_name_prefix = LTO_SECTION_NAME_PREFIX;
          ipa_write_summaries (false);
        }
    }

  if (flag_generate_lto)
    targetm.asm_out.lto_end ();

  if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
    execute_ipa_pass_list (passes->all_regular_ipa_passes);
  invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);

  bitmap_obstack_release (NULL);
}


/* Return, as an identifier, the name DECL is an alias of, taken from
   the string argument of its "alias" attribute.  */

static tree
get_alias_symbol (tree decl)
{
  tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
  return get_identifier (TREE_STRING_POINTER
                         (TREE_VALUE (TREE_VALUE (alias))));
}


/* Weakrefs may be associated with external decls and thus not output
   at expansion time.  Emit all necessary aliases.  */

void
symbol_table::output_weakrefs (void)
{
  symtab_node *node;
  cgraph_node *cnode;
  FOR_EACH_SYMBOL (node)
    if (node->alias
        && !TREE_ASM_WRITTEN (node->decl)
        && (!(cnode = dyn_cast <cgraph_node *> (node))
            || !cnode->instrumented_version
            || !TREE_ASM_WRITTEN (cnode->instrumented_version->decl))
        && node->weakref)
      {
        tree target;

        /* Weakrefs are special in not requiring a target definition in the
           current compilation unit, so it is a bit hard to work out what we
           want to alias.
           When the alias target is defined, we fetch it from the symtab
           reference; otherwise it is pointed to by alias_target.  */
        if (node->alias_target)
          target = (DECL_P (node->alias_target)
                    ? DECL_ASSEMBLER_NAME (node->alias_target)
                    : node->alias_target);
        else if (node->analyzed)
          target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
        else
          {
            gcc_unreachable ();
            target = get_alias_symbol (node->decl);
          }
        do_assemble_alias (node->decl, target);
      }
}

/* Perform simple optimizations based on callgraph.  */

void
symbol_table::compile (void)
{
  if (seen_error ())
    return;

#ifdef ENABLE_CHECKING
  symtab_node::verify_symtab_nodes ();
#endif

  timevar_push (TV_CGRAPHOPT);
  if (pre_ipa_mem_report)
    {
      fprintf (stderr, "Memory consumption before IPA\n");
      dump_memory_report (false);
    }
  if (!quiet_flag)
    fprintf (stderr, "Performing interprocedural optimizations\n");
  state = IPA;

  /* Offloading requires LTO infrastructure.  */
  if (!in_lto_p && g->have_offload)
    flag_generate_lto = 1;

  /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE.  */
  if (flag_generate_lto)
    lto_streamer_hooks_init ();

  /* Don't run the IPA passes if there were any errors or sorry messages.  */
  if (!seen_error ())
    ipa_passes ();

  /* Do nothing else if any IPA pass found errors or if we are just
     streaming LTO.  */
  if (seen_error ()
      || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
    {
      timevar_pop (TV_CGRAPHOPT);
      return;
    }

  /* This pass removes bodies of extern inline functions we never inlined.
     Do this later so other IPA passes see what is really going on.
     FIXME: This should be run just after inlining by the pass manager.  */
  remove_unreachable_nodes (false, dump_file);
  global_info_ready = true;
  if (dump_file)
    {
      fprintf (dump_file, "Optimized ");
      symtab_node::dump_table (dump_file);
    }
  if (post_ipa_mem_report)
    {
      fprintf (stderr, "Memory consumption after IPA\n");
      dump_memory_report (false);
    }
  timevar_pop (TV_CGRAPHOPT);

  /* Output everything.  */
  (*debug_hooks->assembly_start) ();
  if (!quiet_flag)
    fprintf (stderr, "Assembling functions:\n");
#ifdef ENABLE_CHECKING
  symtab_node::verify_symtab_nodes ();
#endif

  materialize_all_clones ();
  bitmap_obstack_initialize (NULL);
  execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
  bitmap_obstack_release (NULL);
  mark_functions_to_output ();

  /* When weakref support is missing, we automatically translate all
     references to NODE to references to its ultimate alias target.
     The renaming mechanism uses the IDENTIFIER_TRANSPARENT_ALIAS flag and
     TREE_CHAIN.

     Set up this mapping before we output any assembler, but once we are sure
     that all symbol renaming is done.

     FIXME: All this ugliness can go away if we just do renaming at the
     gimple level by physically rewriting the IL.  At the moment we can only
     redirect calls, so we need infrastructure for renaming references as
     well.  */
#ifndef ASM_OUTPUT_WEAKREF
  symtab_node *node;

  FOR_EACH_SYMBOL (node)
    if (node->alias
        && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
      {
        IDENTIFIER_TRANSPARENT_ALIAS
          (DECL_ASSEMBLER_NAME (node->decl)) = 1;
        TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
          = (node->alias_target ? node->alias_target
             : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
      }
#endif

  state = EXPANSION;

  if (!flag_toplevel_reorder)
    output_in_order (false);
  else
    {
      /* Output first asm statements and anything ordered.  The process
         flag is cleared for these nodes, so we skip them later.  */
      output_in_order (true);
      expand_all_functions ();
      output_variables ();
    }

  process_new_functions ();
  state = FINISHED;
  output_weakrefs ();

  if (dump_file)
    {
      fprintf (dump_file, "\nFinal ");
      symtab_node::dump_table (dump_file);
    }
#ifdef ENABLE_CHECKING
  symtab_node::verify_symtab_nodes ();
  /* Double check that all inline clones are gone and that all
     function bodies have been released from memory.  */
  if (!seen_error ())
    {
      cgraph_node *node;
      bool error_found = false;

      FOR_EACH_DEFINED_FUNCTION (node)
        if (node->global.inlined_to
            || gimple_has_body_p (node->decl))
          {
            error_found = true;
            node->debug ();
          }
      if (error_found)
        internal_error ("nodes with unreleased memory found");
    }
#endif
}


/* Analyze the whole compilation unit once it is parsed completely.  */

void
symbol_table::finalize_compilation_unit (void)
{
  timevar_push (TV_CGRAPH);

  /* If we're here there's no current function anymore.  Some frontends
     are lazy in clearing these.  */
  current_function_decl = NULL;
  set_cfun (NULL);

  /* Do not skip analyzing the functions if there were errors; otherwise
     we miss diagnostics for the following functions.  */

  /* Emit size functions we didn't inline.  */
  finalize_size_functions ();

  /* Mark alias targets necessary and emit diagnostics.  */
  handle_alias_pairs ();

  if (!quiet_flag)
    {
      fprintf (stderr, "\nAnalyzing compilation unit\n");
      fflush (stderr);
    }

  if (flag_dump_passes)
    dump_passes ();

  /* Gimplify and lower all functions, compute reachability and
     remove unreachable nodes.  */
  analyze_functions ();

  /* Mark alias targets necessary and emit diagnostics.  */
  handle_alias_pairs ();

  /* Gimplify and lower thunks.  */
  analyze_functions ();

  /* Finally drive the pass manager.  */
  compile ();

  timevar_pop (TV_CGRAPH);
}

/* Reset all state within cgraphunit.c so that we can rerun the compiler
   within the same process.  For use by toplev::finalize.  */

void
cgraphunit_c_finalize (void)
{
  gcc_assert (cgraph_new_nodes.length () == 0);
  cgraph_new_nodes.truncate (0);

  vtable_entry_type = NULL;
  queued_nodes = &symtab_terminator;

  first_analyzed = NULL;
  first_analyzed_var = NULL;
}

/* Create a wrapper around this cgraph_node that simply calls the TARGET
   node; a thunk is used to implement this kind of wrapper method.  */

void
cgraph_node::create_wrapper (cgraph_node *target)
{
  /* Preserve DECL_RESULT so we get the right by-reference flag.  */
  tree decl_result = DECL_RESULT (decl);

  /* Remove the function's body but keep its arguments to be reused
     for the thunk.  */
  release_body (true);
  reset ();

  DECL_RESULT (decl) = decl_result;
  DECL_INITIAL (decl) = NULL;
  allocate_struct_function (decl, false);
  set_cfun (NULL);

  /* Turn the alias into a thunk and expand it into GIMPLE representation.  */
  definition = true;
  thunk.thunk_p = true;
  thunk.this_adjusting = false;

  cgraph_edge *e = create_edge (target, NULL, 0, CGRAPH_FREQ_BASE);

  tree arguments = DECL_ARGUMENTS (decl);

  while (arguments)
    {
      TREE_ADDRESSABLE (arguments) = false;
      arguments = TREE_CHAIN (arguments);
    }

  expand_thunk (false, true);
  e->call_stmt_cannot_inline_p = true;

  /* Inline summary set-up.  */
  analyze ();
  inline_analyze_function (this);
}

#include "gt-cgraphunit.h"