Add a no_reorder attribute for LTO
gcc/cgraphunit.c
1 /* Driver of optimization process
2 Copyright (C) 2003-2014 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
4
5 This file is part of GCC.
6
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
11
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
16
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
20
21 /* This module implements main driver of compilation process.
22
23 The main scope of this file is to act as an interface in between
24 tree based frontends and the backend.
25
26 The front-end is supposed to use following functionality:
27
28 - finalize_function
29
30 This function is called once the front end has parsed the whole function body
31 and it is certain that neither the function body nor the declaration will change.
32
33 (There is one exception, needed for implementing GCC extern inline
34 functions.)
35
36 - varpool_finalize_decl
37
38 This function has the same behavior as the above but is used for static
39 variables.
40
41 - add_asm_node
42
43 Inserts a new toplevel ASM statement.
44
45 - finalize_compilation_unit
46
47 This function is called once the (source level) compilation unit is finalized
48 and it will no longer change.
49
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
54
55 At the end the bodies of unreachable functions are removed.
56
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
59
60 - compile
61
62 This passes control to the back-end. Optimizations are performed and
63 the final assembler output is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66 indicated below).
67
68 Compile time:
69
70 1) Inter-procedural optimization.
71 (ipa_passes)
72
73 This part is further split into:
74
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
77
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80 propagation across the callgraph is done, e.g. to discover
81 functions without side effects, and simple inlining is performed.
82
83 b) early small interprocedural passes.
84
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
88
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
91
92 Interprocedural passes differ from small interprocedural
93 passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 reduce both linking times and linktime memory usage by
96 not having to represent the whole program in memory.
97
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
100
101 Compile time and/or linktime analysis stage (WPA):
102
103 At linktime the units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and are not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
111
112 Compile time and/or parallel linktime stage (ltrans)
113
114 Each of the object files is streamed back and compiled
115 separately. Now the function bodies become available
116 again.
117
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
120
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies, by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125 turned into real functions.
126 3) IP transformation
127
128 All IP passes transform function bodies based on earlier
129 decisions of the IP propagation.
130
131 4) late small IP passes
132
133 Simple IP passes working within single program partition.
134
135 5) Expansion
136 (expand_all_functions)
137
138 At this stage functions that need to be output into
139 assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141 Now it is known which variable references were not optimized
142 out and thus all variables are output to the file.
143
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
146
147 Finally there are functions to manipulate the callgraph from
148 the backend.
149 - cgraph_add_new_function is used to add backend produced
150 functions introduced after the unit is finalized.
151 The functions are enqueued for later processing and inserted
152 into the callgraph with cgraph_process_new_functions.
153
154 - cgraph_function_versioning
155
156 produces a copy of a function (a version)
157 and applies simple transformations.
158 */
159
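/* A minimal sketch of how a front end is expected to drive the interface
   described above (illustrative only; finish_one_decl and the iteration
   over the front end's own declarations are hypothetical stand-ins, not
   functions defined anywhere in GCC):

     static void
     finish_one_decl (tree decl)
     {
       if (TREE_CODE (decl) == FUNCTION_DECL)
	 cgraph_node::finalize_function (decl, false);
       else if (TREE_CODE (decl) == VAR_DECL)
	 varpool_node::finalize_decl (decl);
     }

   After all declarations have been handed over this way, the front end
   calls finalize_compilation_unit (as named above), possibly once per
   source level unit, which builds the symbol table and ultimately passes
   control to the back end as described in the "compile" section.  */
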
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "tm.h"
164 #include "tree.h"
165 #include "varasm.h"
166 #include "stor-layout.h"
167 #include "stringpool.h"
168 #include "output.h"
169 #include "rtl.h"
170 #include "basic-block.h"
171 #include "tree-ssa-alias.h"
172 #include "internal-fn.h"
173 #include "gimple-fold.h"
174 #include "gimple-expr.h"
175 #include "is-a.h"
176 #include "gimple.h"
177 #include "gimplify.h"
178 #include "gimple-iterator.h"
179 #include "gimplify-me.h"
180 #include "gimple-ssa.h"
181 #include "tree-cfg.h"
182 #include "tree-into-ssa.h"
183 #include "tree-ssa.h"
184 #include "tree-inline.h"
185 #include "langhooks.h"
186 #include "toplev.h"
187 #include "flags.h"
188 #include "debug.h"
189 #include "target.h"
190 #include "diagnostic.h"
191 #include "params.h"
192 #include "fibheap.h"
193 #include "intl.h"
194 #include "function.h"
195 #include "ipa-prop.h"
196 #include "tree-iterator.h"
197 #include "tree-pass.h"
198 #include "tree-dump.h"
199 #include "gimple-pretty-print.h"
200 #include "output.h"
201 #include "coverage.h"
202 #include "plugin.h"
203 #include "ipa-inline.h"
204 #include "ipa-utils.h"
205 #include "lto-streamer.h"
206 #include "except.h"
207 #include "cfgloop.h"
208 #include "regset.h" /* FIXME: For reg_obstack. */
209 #include "context.h"
210 #include "pass_manager.h"
211 #include "tree-nested.h"
212 #include "gimplify.h"
213 #include "dbgcnt.h"
214
215 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
216 secondary queue used during optimization to accommodate passes that
217 may generate new functions that need to be optimized and expanded. */
218 vec<cgraph_node *> cgraph_new_nodes;
219
220 static void expand_all_functions (void);
221 static void mark_functions_to_output (void);
222 static void handle_alias_pairs (void);
223
224 /* Used for vtable lookup in thunk adjusting. */
225 static GTY (()) tree vtable_entry_type;
226
227 /* Determine if the symbol declaration is needed. That is, it is visible to
228 something outside this translation unit or to something magic in the system
229 configury. */
230 bool
231 symtab_node::needed_p (void)
232 {
233 /* Double check that no one output the function into assembly file
234 early. */
235 gcc_checking_assert (!DECL_ASSEMBLER_NAME_SET_P (decl)
236 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
237
238 if (!definition)
239 return false;
240
241 if (DECL_EXTERNAL (decl))
242 return false;
243
244 /* If the user told us it is used, then it must be so. */
245 if (force_output)
246 return true;
247
248 /* ABI forced symbols are needed when they are external. */
249 if (forced_by_abi && TREE_PUBLIC (decl))
250 return true;
251
252 /* Keep constructors, destructors and virtual functions. */
253 if (TREE_CODE (decl) == FUNCTION_DECL
254 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
255 return true;
256
257 /* Externally visible variables must be output. The exception is
258 COMDAT variables that must be output only when they are needed. */
259 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
260 return true;
261
262 return false;
263 }
264
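/* For example (illustrative user code only):

     int exported = 1;
     static int helper_state;
     __attribute__ ((constructor)) static void boot (void) {}

   Here "exported" (TREE_PUBLIC and not comdat) and "boot" (a static
   constructor) are trivially needed, while "helper_state" is kept only if
   something discovered later refers to it.  */
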
265 /* Head and terminator of the queue of nodes to be processed while building
266 callgraph. */
267
268 static symtab_node symtab_terminator;
269 static symtab_node *queued_nodes = &symtab_terminator;
270
271 /* Add NODE to the queue starting at QUEUED_NODES.
272 The queue is linked via AUX pointers and terminated by a pointer to symtab_terminator. */
273
274 static void
275 enqueue_node (symtab_node *node)
276 {
277 if (node->aux)
278 return;
279 gcc_checking_assert (queued_nodes);
280 node->aux = queued_nodes;
281 queued_nodes = node;
282 }
283
284 /* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
285 functions into the callgraph in a way that makes them look like ordinary
286 reachable functions inserted into the callgraph already at construction time. */
287
288 void
289 symbol_table::process_new_functions (void)
290 {
291 tree fndecl;
292
293 if (!cgraph_new_nodes.exists ())
294 return;
295
296 handle_alias_pairs ();
297 /* Note that this queue may grow as it is being processed, as the new
298 functions may generate new ones. */
299 for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
300 {
301 cgraph_node *node = cgraph_new_nodes[i];
302 fndecl = node->decl;
303 switch (state)
304 {
305 case CONSTRUCTION:
306 /* At construction time we just need to finalize function and move
307 it into reachable functions list. */
308
309 cgraph_node::finalize_function (fndecl, false);
310 call_cgraph_insertion_hooks (node);
311 enqueue_node (node);
312 break;
313
314 case IPA:
315 case IPA_SSA:
316 /* When IPA optimization has already started, do all essential
317 transformations that have already been performed on the whole
318 cgraph but not on this function. */
319
320 gimple_register_cfg_hooks ();
321 if (!node->analyzed)
322 node->analyze ();
323 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
324 if (state == IPA_SSA
325 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
326 g->get_passes ()->execute_early_local_passes ();
327 else if (inline_summary_vec != NULL)
328 compute_inline_parameters (node, true);
329 free_dominance_info (CDI_POST_DOMINATORS);
330 free_dominance_info (CDI_DOMINATORS);
331 pop_cfun ();
332 break;
333
334 case EXPANSION:
335 /* Functions created during expansion shall be compiled
336 directly. */
337 node->process = 0;
338 call_cgraph_insertion_hooks (node);
339 node->expand ();
340 break;
341
342 default:
343 gcc_unreachable ();
344 break;
345 }
346 }
347
348 cgraph_new_nodes.release ();
349 }
350
351 /* As a GCC extension we allow redefinition of the function. The
352 semantics when the two bodies differ is not well defined.
353 We replace the old body with the new body, so in unit-at-a-time mode
354 we always use the new body, while in normal mode we may end up with
355 the old body inlined into some functions and the new body expanded and
356 inlined in others.
357
358 ??? It may make more sense to use one body for inlining and the other
359 body for expanding the function but this is difficult to do. */
360
361 void
362 cgraph_node::reset (void)
363 {
364 /* If process is set, then we have already begun whole-unit analysis.
365 This is *not* testing for whether we've already emitted the function.
366 That case can be sort-of legitimately seen with real function redefinition
367 errors. I would argue that the front end should never present us with
368 such a case, but don't enforce that for now. */
369 gcc_assert (!process);
370
371 /* Reset our data structures so we can analyze the function again. */
372 memset (&local, 0, sizeof (local));
373 memset (&global, 0, sizeof (global));
374 memset (&rtl, 0, sizeof (rtl));
375 analyzed = false;
376 definition = false;
377 alias = false;
378 weakref = false;
379 cpp_implicit_alias = false;
380
381 remove_callees ();
382 remove_all_references ();
383 }
384
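/* The situation described above arises, for example, with the GNU89 extern
   inline extension (an illustrative sketch only):

     extern inline int two (void) { return 1; }
     int two (void) { return 2; }

   The second definition replaces the first; cgraph_node::reset throws away
   the data gathered for the old body so the function can be analyzed again
   with the new one.  */
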
385 /* Return true when there are references to the node. */
386
387 bool
388 symtab_node::referred_to_p (void)
389 {
390 ipa_ref *ref = NULL;
391
392 /* See if there are any references at all. */
393 if (iterate_referring (0, ref))
394 return true;
395 /* For functions check also calls. */
396 cgraph_node *cn = dyn_cast <cgraph_node *> (this);
397 if (cn && cn->callers)
398 return true;
399 return false;
400 }
401
402 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
403 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
404 the garbage collector run at the moment. We would need to either create
405 a new GC context, or just not compile right now. */
406
407 void
408 cgraph_node::finalize_function (tree decl, bool no_collect)
409 {
410 cgraph_node *node = cgraph_node::get_create (decl);
411
412 if (node->definition)
413 {
414 /* Nested functions should only be defined once. */
415 gcc_assert (!DECL_CONTEXT (decl)
416 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
417 node->reset ();
418 node->local.redefined_extern_inline = true;
419 }
420
421 notice_global_symbol (decl);
422 node->definition = true;
423 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
424
425 /* With -fkeep-inline-functions we are keeping all inline functions except
426 for extern inline ones. */
427 if (flag_keep_inline_functions
428 && DECL_DECLARED_INLINE_P (decl)
429 && !DECL_EXTERNAL (decl)
430 && !DECL_DISREGARD_INLINE_LIMITS (decl))
431 node->force_output = 1;
432
433 /* When not optimizing, also output the static functions (see
434 PR24561), but don't do so for always_inline functions, functions
435 declared inline and nested functions. These were optimized out
436 in the original implementation and it is unclear whether we want
437 to change the behavior here. */
438 if ((!optimize
439 && !node->cpp_implicit_alias
440 && !DECL_DISREGARD_INLINE_LIMITS (decl)
441 && !DECL_DECLARED_INLINE_P (decl)
442 && !(DECL_CONTEXT (decl)
443 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
444 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
445 node->force_output = 1;
446
447 /* If we've not yet emitted decl, tell the debug info about it. */
448 if (!TREE_ASM_WRITTEN (decl))
449 (*debug_hooks->deferred_inline_function) (decl);
450
451 /* Possibly warn about unused parameters. */
452 if (warn_unused_parameter)
453 do_warn_unused_parameter (decl);
454
455 if (!no_collect)
456 ggc_collect ();
457
458 if (symtab->state == CONSTRUCTION
459 && (node->needed_p () || node->referred_to_p ()))
460 enqueue_node (node);
461 }
462
463 /* Add the function FNDECL to the call graph.
464 Unlike finalize_function, this function is intended to be used
465 by the middle end and allows insertion of new functions at an arbitrary
466 point of compilation. The function can be in either high, low or SSA form
467 GIMPLE.
468
469 The function is assumed to be reachable and have its address taken (so no
470 API-breaking optimizations are performed on it).
471
472 The main work done by this function is to enqueue the function for later
473 processing, avoiding the need for the passes to be re-entrant. */
474
475 void
476 cgraph_node::add_new_function (tree fndecl, bool lowered)
477 {
478 gcc::pass_manager *passes = g->get_passes ();
479 cgraph_node *node;
480 switch (symtab->state)
481 {
482 case PARSING:
483 cgraph_node::finalize_function (fndecl, false);
484 break;
485 case CONSTRUCTION:
486 /* Just enqueue function to be processed at nearest occurrence. */
487 node = cgraph_node::get_create (fndecl);
488 if (lowered)
489 node->lowered = true;
490 cgraph_new_nodes.safe_push (node);
491 break;
492
493 case IPA:
494 case IPA_SSA:
495 case EXPANSION:
496 /* Bring the function into finalized state and enqueue for later
497 analyzing and compilation. */
498 node = cgraph_node::get_create (fndecl);
499 node->local.local = false;
500 node->definition = true;
501 node->force_output = true;
502 if (!lowered && symtab->state == EXPANSION)
503 {
504 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
505 gimple_register_cfg_hooks ();
506 bitmap_obstack_initialize (NULL);
507 execute_pass_list (cfun, passes->all_lowering_passes);
508 passes->execute_early_local_passes ();
509 bitmap_obstack_release (NULL);
510 pop_cfun ();
511
512 lowered = true;
513 }
514 if (lowered)
515 node->lowered = true;
516 cgraph_new_nodes.safe_push (node);
517 break;
518
519 case FINISHED:
520 /* At the very end of compilation we have to do all the work up
521 to expansion. */
522 node = cgraph_node::create (fndecl);
523 if (lowered)
524 node->lowered = true;
525 node->definition = true;
526 node->analyze ();
527 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
528 gimple_register_cfg_hooks ();
529 bitmap_obstack_initialize (NULL);
530 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
531 g->get_passes ()->execute_early_local_passes ();
532 bitmap_obstack_release (NULL);
533 pop_cfun ();
534 node->expand ();
535 break;
536
537 default:
538 gcc_unreachable ();
539 }
540
541 /* Set a personality if required and we already passed EH lowering. */
542 if (lowered
543 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
544 == eh_personality_lang))
545 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
546 }
547
548 /* Analyze the function scheduled to be output. */
549 void
550 cgraph_node::analyze (void)
551 {
552 tree decl = this->decl;
553 location_t saved_loc = input_location;
554 input_location = DECL_SOURCE_LOCATION (decl);
555
556 if (thunk.thunk_p)
557 {
558 create_edge (cgraph_node::get (thunk.alias),
559 NULL, 0, CGRAPH_FREQ_BASE);
560 if (!expand_thunk (false, false))
561 {
562 thunk.alias = NULL;
563 return;
564 }
565 thunk.alias = NULL;
566 }
567 if (alias)
568 resolve_alias (cgraph_node::get (alias_target));
569 else if (dispatcher_function)
570 {
571 /* Generate the dispatcher body of multi-versioned functions. */
572 cgraph_function_version_info *dispatcher_version_info
573 = function_version ();
574 if (dispatcher_version_info != NULL
575 && (dispatcher_version_info->dispatcher_resolver
576 == NULL_TREE))
577 {
578 tree resolver = NULL_TREE;
579 gcc_assert (targetm.generate_version_dispatcher_body);
580 resolver = targetm.generate_version_dispatcher_body (this);
581 gcc_assert (resolver != NULL_TREE);
582 }
583 }
584 else
585 {
586 push_cfun (DECL_STRUCT_FUNCTION (decl));
587
588 assign_assembler_name_if_neeeded (decl);
589
590 /* Make sure to gimplify bodies only once. During analyzing a
591 function we lower it, which will require gimplified nested
592 functions, so we can end up here with an already gimplified
593 body. */
594 if (!gimple_has_body_p (decl))
595 gimplify_function_tree (decl);
596 dump_function (TDI_generic, decl);
597
598 /* Lower the function. */
599 if (!lowered)
600 {
601 if (nested)
602 lower_nested_functions (decl);
603 gcc_assert (!nested);
604
605 gimple_register_cfg_hooks ();
606 bitmap_obstack_initialize (NULL);
607 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
608 free_dominance_info (CDI_POST_DOMINATORS);
609 free_dominance_info (CDI_DOMINATORS);
610 compact_blocks ();
611 bitmap_obstack_release (NULL);
612 lowered = true;
613 }
614
615 pop_cfun ();
616 }
617 analyzed = true;
618
619 input_location = saved_loc;
620 }
621
622 /* The C++ frontend produces same body aliases all over the place, even before
623 PCH gets streamed out. It relies on us linking the aliases with their
624 functions in order to do the fixups, but ipa-ref is not PCH safe. Consequently
625 we first produce aliases without links, and once the C++ FE is sure it won't
626 stream PCH we build the links via this function. */
627
628 void
629 symbol_table::process_same_body_aliases (void)
630 {
631 symtab_node *node;
632 FOR_EACH_SYMBOL (node)
633 if (node->cpp_implicit_alias && !node->analyzed)
634 node->resolve_alias
635 (TREE_CODE (node->alias_target) == VAR_DECL
636 ? (symtab_node *)varpool_node::get_create (node->alias_target)
637 : (symtab_node *)cgraph_node::get_create (node->alias_target));
638 cpp_implicit_aliases_done = true;
639 }
640
641 /* Process attributes common for vars and functions. */
642
643 static void
644 process_common_attributes (symtab_node *node, tree decl)
645 {
646 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
647
648 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
649 {
650 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
651 "%<weakref%> attribute should be accompanied with"
652 " an %<alias%> attribute");
653 DECL_WEAK (decl) = 0;
654 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
655 DECL_ATTRIBUTES (decl));
656 }
657
658 if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
659 node->no_reorder = 1;
660 }
661
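/* The no_reorder handling above corresponds to user code such as
   (illustrative only):

     int first_counter __attribute__ ((no_reorder));
     int second_counter __attribute__ ((no_reorder));

   Symbols carrying the attribute keep their source order in the output even
   when toplevel reordering (for example under LTO) would otherwise move
   them.  */
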
662 /* Look for externally_visible and used attributes and mark cgraph nodes
663 accordingly.
664
665 We cannot mark the nodes at the point the attributes are processed (in
666 handle_*_attribute) because the copy of the declarations available at that
667 point may not be canonical. For example, in:
668
669 void f();
670 void f() __attribute__((used));
671
672 the declaration we see in handle_used_attribute will be the second
673 declaration -- but the front end will subsequently merge that declaration
674 with the original declaration and discard the second declaration.
675
676 Furthermore, we can't mark these nodes in finalize_function because:
677
678 void f() {}
679 void f() __attribute__((externally_visible));
680
681 is valid.
682
683 So, we walk the nodes at the end of the translation unit, applying the
684 attributes at that point. */
685
686 static void
687 process_function_and_variable_attributes (cgraph_node *first,
688 varpool_node *first_var)
689 {
690 cgraph_node *node;
691 varpool_node *vnode;
692
693 for (node = symtab->first_function (); node != first;
694 node = symtab->next_function (node))
695 {
696 tree decl = node->decl;
697 if (DECL_PRESERVE_P (decl))
698 node->mark_force_output ();
699 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
700 {
701 if (! TREE_PUBLIC (node->decl))
702 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
703 "%<externally_visible%>"
704 " attribute have effect only on public objects");
705 }
706 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
707 && (node->definition && !node->alias))
708 {
709 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
710 "%<weakref%> attribute ignored"
711 " because function is defined");
712 DECL_WEAK (decl) = 0;
713 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
714 DECL_ATTRIBUTES (decl));
715 }
716
717 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
718 && !DECL_DECLARED_INLINE_P (decl)
719 /* redefining extern inline function makes it DECL_UNINLINABLE. */
720 && !DECL_UNINLINABLE (decl))
721 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
722 "always_inline function might not be inlinable");
723
724 process_common_attributes (node, decl);
725 }
726 for (vnode = symtab->first_variable (); vnode != first_var;
727 vnode = symtab->next_variable (vnode))
728 {
729 tree decl = vnode->decl;
730 if (DECL_EXTERNAL (decl)
731 && DECL_INITIAL (decl))
732 varpool_node::finalize_decl (decl);
733 if (DECL_PRESERVE_P (decl))
734 vnode->force_output = true;
735 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
736 {
737 if (! TREE_PUBLIC (vnode->decl))
738 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
739 "%<externally_visible%>"
740 " attribute have effect only on public objects");
741 }
742 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
743 && vnode->definition
744 && DECL_INITIAL (decl))
745 {
746 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
747 "%<weakref%> attribute ignored"
748 " because variable is initialized");
749 DECL_WEAK (decl) = 0;
750 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
751 DECL_ATTRIBUTES (decl));
752 }
753 process_common_attributes (vnode, decl);
754 }
755 }
756
757 /* Mark DECL as finalized. By finalizing the declaration, the frontend instructs
758 the middle end to output the variable to the asm file, if needed or externally
759 visible. */
760
761 void
762 varpool_node::finalize_decl (tree decl)
763 {
764 varpool_node *node = varpool_node::get_create (decl);
765
766 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
767
768 if (node->definition)
769 return;
770 notice_global_symbol (decl);
771 node->definition = true;
772 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
773 /* Traditionally we do not eliminate static variables when not
774 optimizing and when not doing toplevel reorder. */
775 || node->no_reorder
776 || ((!flag_toplevel_reorder
777 && !DECL_COMDAT (node->decl)
778 && !DECL_ARTIFICIAL (node->decl))))
779 node->force_output = true;
780
781 if (symtab->state == CONSTRUCTION
782 && (node->needed_p () || node->referred_to_p ()))
783 enqueue_node (node);
784 if (symtab->state >= IPA_SSA)
785 node->analyze ();
786 /* Some frontends produce various interface variables after compilation
787 finished. */
788 if (symtab->state == FINISHED
789 || (!flag_toplevel_reorder
790 && symtab->state == EXPANSION))
791 node->assemble_decl ();
792 }
793
794 /* EDGE is a polymorphic call. Mark all possible targets as reachable
795 and if there is only one target, perform trivial devirtualization.
796 REACHABLE_CALL_TARGETS collects target lists we already walked to
797 avoid duplicate work. */
798
799 static void
800 walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
801 cgraph_edge *edge)
802 {
803 unsigned int i;
804 void *cache_token;
805 bool final;
806 vec <cgraph_node *>targets
807 = possible_polymorphic_call_targets
808 (edge, &final, &cache_token);
809
810 if (!reachable_call_targets->add (cache_token))
811 {
812 if (symtab->dump_file)
813 dump_possible_polymorphic_call_targets
814 (symtab->dump_file, edge);
815
816 for (i = 0; i < targets.length (); i++)
817 {
818 /* Do not bother to mark virtual methods in anonymous namespace;
819 either we will find use of virtual table defining it, or it is
820 unused. */
821 if (targets[i]->definition
822 && TREE_CODE
823 (TREE_TYPE (targets[i]->decl))
824 == METHOD_TYPE
825 && !type_in_anonymous_namespace_p
826 (method_class_type
827 (TREE_TYPE (targets[i]->decl))))
828 enqueue_node (targets[i]);
829 }
830 }
831
832 /* Very trivial devirtualization; when the type is
833 final or anonymous (so we know all of its derivations)
834 and there is only one possible virtual call target,
835 make the edge direct. */
836 if (final)
837 {
838 if (targets.length () <= 1 && dbg_cnt (devirt))
839 {
840 cgraph_node *target;
841 if (targets.length () == 1)
842 target = targets[0];
843 else
844 target = cgraph_node::create
845 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
846
847 if (symtab->dump_file)
848 {
849 fprintf (symtab->dump_file,
850 "Devirtualizing call: ");
851 print_gimple_stmt (symtab->dump_file,
852 edge->call_stmt, 0,
853 TDF_SLIM);
854 }
855 if (dump_enabled_p ())
856 {
857 location_t locus = gimple_location_safe (edge->call_stmt);
858 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, locus,
859 "devirtualizing call in %s to %s\n",
860 edge->caller->name (), target->name ());
861 }
862
863 edge->make_direct (target);
864 edge->redirect_call_stmt_to_callee ();
865 if (symtab->dump_file)
866 {
867 fprintf (symtab->dump_file,
868 "Devirtualized as: ");
869 print_gimple_stmt (symtab->dump_file,
870 edge->call_stmt, 0,
871 TDF_SLIM);
872 }
873 }
874 }
875 }
876
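/* As an illustration of the "final" case above (hypothetical C++ user code):

     struct S { virtual int f (); };
     struct T final : S { int f () { return 2; } };
     int use (T *t) { return t->f (); }

   Since T is final, the only possible target of the virtual call is T::f,
   so the indirect edge can be made direct.  */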
877
878 /* Discover all functions and variables that are trivially needed, analyze
879 them as well as all functions and variables referred to by them. */
880
881 static void
882 analyze_functions (void)
883 {
884 /* Keep track of already processed nodes when called multiple times for
885 intermodule optimization. */
886 static cgraph_node *first_analyzed;
887 cgraph_node *first_handled = first_analyzed;
888 static varpool_node *first_analyzed_var;
889 varpool_node *first_handled_var = first_analyzed_var;
890 hash_set<void *> reachable_call_targets;
891
892 symtab_node *node;
893 symtab_node *next;
894 int i;
895 ipa_ref *ref;
896 bool changed = true;
897 location_t saved_loc = input_location;
898
899 bitmap_obstack_initialize (NULL);
900 symtab->state = CONSTRUCTION;
901 input_location = UNKNOWN_LOCATION;
902
903 /* Ugly, but the fixup cannot happen at the time the same body alias is created;
904 the C++ FE is confused about the COMDAT groups being right. */
905 if (symtab->cpp_implicit_aliases_done)
906 FOR_EACH_SYMBOL (node)
907 if (node->cpp_implicit_alias)
908 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
909 if (optimize && flag_devirtualize)
910 build_type_inheritance_graph ();
911
912 /* Analysis adds static variables that in turn add references to new functions.
913 So we need to iterate the process until it stabilizes. */
914 while (changed)
915 {
916 changed = false;
917 process_function_and_variable_attributes (first_analyzed,
918 first_analyzed_var);
919
920 /* First identify the trivially needed symbols. */
921 for (node = symtab->first_symbol ();
922 node != first_analyzed
923 && node != first_analyzed_var; node = node->next)
924 {
925 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
926 node->get_comdat_group_id ();
927 if (node->needed_p ())
928 {
929 enqueue_node (node);
930 if (!changed && symtab->dump_file)
931 fprintf (symtab->dump_file, "Trivially needed symbols:");
932 changed = true;
933 if (symtab->dump_file)
934 fprintf (symtab->dump_file, " %s", node->asm_name ());
935 if (!changed && symtab->dump_file)
936 fprintf (symtab->dump_file, "\n");
937 }
938 if (node == first_analyzed
939 || node == first_analyzed_var)
940 break;
941 }
942 symtab->process_new_functions ();
943 first_analyzed_var = symtab->first_variable ();
944 first_analyzed = symtab->first_function ();
945
946 if (changed && symtab->dump_file)
947 fprintf (symtab->dump_file, "\n");
948
949 /* Lower representation, build callgraph edges and references for all trivially
950 needed symbols and all symbols referred to by them. */
951 while (queued_nodes != &symtab_terminator)
952 {
953 changed = true;
954 node = queued_nodes;
955 queued_nodes = (symtab_node *)queued_nodes->aux;
956 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
957 if (cnode && cnode->definition)
958 {
959 cgraph_edge *edge;
960 tree decl = cnode->decl;
961
962 /* ??? It is possible to create an extern inline function
963 and later use the weak alias attribute to kill its body.
964 See gcc.c-torture/compile/20011119-1.c */
965 if (!DECL_STRUCT_FUNCTION (decl)
966 && !cnode->alias
967 && !cnode->thunk.thunk_p
968 && !cnode->dispatcher_function)
969 {
970 cnode->reset ();
971 cnode->local.redefined_extern_inline = true;
972 continue;
973 }
974
975 if (!cnode->analyzed)
976 cnode->analyze ();
977
978 for (edge = cnode->callees; edge; edge = edge->next_callee)
979 if (edge->callee->definition)
980 enqueue_node (edge->callee);
981 if (optimize && flag_devirtualize)
982 {
983 cgraph_edge *next;
984
985 for (edge = cnode->indirect_calls; edge; edge = next)
986 {
987 next = edge->next_callee;
988 if (edge->indirect_info->polymorphic)
989 walk_polymorphic_call_targets (&reachable_call_targets,
990 edge);
991 }
992 }
993
994 /* If decl is a clone of an abstract function,
995 mark that abstract function so that we don't release its body.
996 The DECL_INITIAL() of that abstract function declaration
997 will later be needed to output debug info. */
998 if (DECL_ABSTRACT_ORIGIN (decl))
999 {
1000 cgraph_node *origin_node
1001 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1002 origin_node->used_as_abstract_origin = true;
1003 }
1004 }
1005 else
1006 {
1007 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1008 if (vnode && vnode->definition && !vnode->analyzed)
1009 vnode->analyze ();
1010 }
1011
1012 if (node->same_comdat_group)
1013 {
1014 symtab_node *next;
1015 for (next = node->same_comdat_group;
1016 next != node;
1017 next = next->same_comdat_group)
1018 enqueue_node (next);
1019 }
1020 for (i = 0; node->iterate_reference (i, ref); i++)
1021 if (ref->referred->definition)
1022 enqueue_node (ref->referred);
1023 symtab->process_new_functions ();
1024 }
1025 }
1026 if (optimize && flag_devirtualize)
1027 update_type_inheritance_graph ();
1028
1029 /* Collect entry points to the unit. */
1030 if (symtab->dump_file)
1031 {
1032 fprintf (symtab->dump_file, "\n\nInitial ");
1033 symtab_node::dump_table (symtab->dump_file);
1034 }
1035
1036 if (symtab->dump_file)
1037 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1038
1039 for (node = symtab->first_symbol ();
1040 node != first_handled
1041 && node != first_handled_var; node = next)
1042 {
1043 next = node->next;
1044 if (!node->aux && !node->referred_to_p ())
1045 {
1046 if (symtab->dump_file)
1047 fprintf (symtab->dump_file, " %s", node->name ());
1048 node->remove ();
1049 continue;
1050 }
1051 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1052 {
1053 tree decl = node->decl;
1054
1055 if (cnode->definition && !gimple_has_body_p (decl)
1056 && !cnode->alias
1057 && !cnode->thunk.thunk_p)
1058 cnode->reset ();
1059
1060 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1061 || cnode->alias
1062 || gimple_has_body_p (decl));
1063 gcc_assert (cnode->analyzed == cnode->definition);
1064 }
1065 node->aux = NULL;
1066 }
1067 for (;node; node = node->next)
1068 node->aux = NULL;
1069 first_analyzed = symtab->first_function ();
1070 first_analyzed_var = symtab->first_variable ();
1071 if (symtab->dump_file)
1072 {
1073 fprintf (symtab->dump_file, "\n\nReclaimed ");
1074 symtab_node::dump_table (symtab->dump_file);
1075 }
1076 bitmap_obstack_release (NULL);
1077 ggc_collect ();
1078 /* Initialize assembler name hash, in particular we want to trigger C++
1079 mangling and same body alias creation before we free DECL_ARGUMENTS
1080 used by it. */
1081 if (!seen_error ())
1082 symtab->symtab_initialize_asm_name_hash ();
1083
1084 input_location = saved_loc;
1085 }
1086
1087 /* Translate the ugly representation of aliases as alias pairs into a nice
1088 representation in the callgraph. We don't handle all cases yet,
1089 unfortunately. */
1090
1091 static void
1092 handle_alias_pairs (void)
1093 {
1094 alias_pair *p;
1095 unsigned i;
1096
1097 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1098 {
1099 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1100
1101 /* Weakrefs whose target is not defined in the current unit are easy to handle:
1102 they behave just like external variables except that we need to note the
1103 alias flag to later output the weakref pseudo op into the asm file. */
1104 if (!target_node
1105 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1106 {
1107 symtab_node *node = symtab_node::get (p->decl);
1108 if (node)
1109 {
1110 node->alias_target = p->target;
1111 node->weakref = true;
1112 node->alias = true;
1113 }
1114 alias_pairs->unordered_remove (i);
1115 continue;
1116 }
1117 else if (!target_node)
1118 {
1119 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1120 symtab_node *node = symtab_node::get (p->decl);
1121 if (node)
1122 node->alias = false;
1123 alias_pairs->unordered_remove (i);
1124 continue;
1125 }
1126
1127 if (DECL_EXTERNAL (target_node->decl)
1128 /* We use local aliases for C++ thunks to force the tailcall
1129 to bind locally. This is a hack - to keep it working do
1130 the following (which is not strictly correct). */
1131 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1132 || ! DECL_VIRTUAL_P (target_node->decl))
1133 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1134 {
1135 error ("%q+D aliased to external symbol %qE",
1136 p->decl, p->target);
1137 }
1138
1139 if (TREE_CODE (p->decl) == FUNCTION_DECL
1140 && target_node && is_a <cgraph_node *> (target_node))
1141 {
1142 cgraph_node *src_node = cgraph_node::get (p->decl);
1143 if (src_node && src_node->definition)
1144 src_node->reset ();
1145 cgraph_node::create_alias (p->decl, target_node->decl);
1146 alias_pairs->unordered_remove (i);
1147 }
1148 else if (TREE_CODE (p->decl) == VAR_DECL
1149 && target_node && is_a <varpool_node *> (target_node))
1150 {
1151 varpool_node::create_alias (p->decl, target_node->decl);
1152 alias_pairs->unordered_remove (i);
1153 }
1154 else
1155 {
1156 error ("%q+D alias in between function and variable is not supported",
1157 p->decl);
1158 warning (0, "%q+D aliased declaration",
1159 target_node->decl);
1160 alias_pairs->unordered_remove (i);
1161 }
1162 }
1163 vec_free (alias_pairs);
1164 }
1165
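/* The alias pairs handled above come from user code such as
   (illustrative only):

     void impl (void) { }
     void api (void) __attribute__ ((alias ("impl")));
     static void wref (void) __attribute__ ((weakref ("external_impl")));

   "api" becomes a cgraph alias of "impl"; "wref", whose target is not
   defined in this unit, is only marked as a weakref so that the weakref
   pseudo op can be emitted later.  */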
1166
1167 /* Figure out what functions we want to assemble. */
1168
1169 static void
1170 mark_functions_to_output (void)
1171 {
1172 cgraph_node *node;
1173 #ifdef ENABLE_CHECKING
1174 bool check_same_comdat_groups = false;
1175
1176 FOR_EACH_FUNCTION (node)
1177 gcc_assert (!node->process);
1178 #endif
1179
1180 FOR_EACH_FUNCTION (node)
1181 {
1182 tree decl = node->decl;
1183
1184 gcc_assert (!node->process || node->same_comdat_group);
1185 if (node->process)
1186 continue;
1187
1188 /* We need to output all local functions that are used and not
1189 always inlined, as well as those that are reachable from
1190 outside the current compilation unit. */
1191 if (node->analyzed
1192 && !node->thunk.thunk_p
1193 && !node->alias
1194 && !node->global.inlined_to
1195 && !TREE_ASM_WRITTEN (decl)
1196 && !DECL_EXTERNAL (decl))
1197 {
1198 node->process = 1;
1199 if (node->same_comdat_group)
1200 {
1201 cgraph_node *next;
1202 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1203 next != node;
1204 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1205 if (!next->thunk.thunk_p && !next->alias
1206 && !next->comdat_local_p ())
1207 next->process = 1;
1208 }
1209 }
1210 else if (node->same_comdat_group)
1211 {
1212 #ifdef ENABLE_CHECKING
1213 check_same_comdat_groups = true;
1214 #endif
1215 }
1216 else
1217 {
1218 /* We should've reclaimed all functions that are not needed. */
1219 #ifdef ENABLE_CHECKING
1220 if (!node->global.inlined_to
1221 && gimple_has_body_p (decl)
1222 /* FIXME: in an ltrans unit when the offline copy is outside a partition but
1223 inline copies are inside a partition, we can end up not removing the body
1224 since we no longer have an analyzed node pointing to it. */
1225 && !node->in_other_partition
1226 && !node->alias
1227 && !node->clones
1228 && !DECL_EXTERNAL (decl))
1229 {
1230 node->debug ();
1231 internal_error ("failed to reclaim unneeded function");
1232 }
1233 #endif
1234 gcc_assert (node->global.inlined_to
1235 || !gimple_has_body_p (decl)
1236 || node->in_other_partition
1237 || node->clones
1238 || DECL_ARTIFICIAL (decl)
1239 || DECL_EXTERNAL (decl));
1240
1241 }
1242
1243 }
1244 #ifdef ENABLE_CHECKING
1245 if (check_same_comdat_groups)
1246 FOR_EACH_FUNCTION (node)
1247 if (node->same_comdat_group && !node->process)
1248 {
1249 tree decl = node->decl;
1250 if (!node->global.inlined_to
1251 && gimple_has_body_p (decl)
1252 /* FIXME: in an ltrans unit when the offline copy is outside a
1253 partition but inline copies are inside a partition, we can
1254 end up not removing the body since we no longer have an
1255 analyzed node pointing to it. */
1256 && !node->in_other_partition
1257 && !node->clones
1258 && !DECL_EXTERNAL (decl))
1259 {
1260 node->debug ();
1261 internal_error ("failed to reclaim unneeded function in same "
1262 "comdat group");
1263 }
1264 }
1265 #endif
1266 }
1267
1268 /* DECL is a FUNCTION_DECL. Initialize datastructures so DECL is a function
1269 in lowered gimple form. IN_SSA is true if the gimple is in SSA.
1270
1271 Set current_function_decl and cfun to the newly constructed empty function body.
1272 Return the basic block in the function body. */
1273
1274 basic_block
1275 init_lowered_empty_function (tree decl, bool in_ssa)
1276 {
1277 basic_block bb;
1278
1279 current_function_decl = decl;
1280 allocate_struct_function (decl, false);
1281 gimple_register_cfg_hooks ();
1282 init_empty_tree_cfg ();
1283
1284 if (in_ssa)
1285 {
1286 init_tree_ssa (cfun);
1287 init_ssa_operands (cfun);
1288 cfun->gimple_df->in_ssa_p = true;
1289 cfun->curr_properties |= PROP_ssa;
1290 }
1291
1292 DECL_INITIAL (decl) = make_node (BLOCK);
1293
1294 DECL_SAVED_TREE (decl) = error_mark_node;
1295 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1296 | PROP_cfg | PROP_loops);
1297
1298 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1299 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1300 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1301
1302 /* Create BB for body of the function and connect it properly. */
1303 bb = create_basic_block (NULL, (void *) 0, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1304 make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1305 make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1306 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1307
1308 return bb;
1309 }
1310
1311 /* Adjust PTR by the constant FIXED_OFFSET, and by the vtable
1312 offset indicated by VIRTUAL_OFFSET, if that is
1313 non-null. THIS_ADJUSTING is nonzero for a this adjusting thunk and
1314 zero for a result adjusting thunk. */
1315
1316 static tree
1317 thunk_adjust (gimple_stmt_iterator * bsi,
1318 tree ptr, bool this_adjusting,
1319 HOST_WIDE_INT fixed_offset, tree virtual_offset)
1320 {
1321 gimple stmt;
1322 tree ret;
1323
1324 if (this_adjusting
1325 && fixed_offset != 0)
1326 {
1327 stmt = gimple_build_assign
1328 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1329 ptr,
1330 fixed_offset));
1331 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1332 }
1333
1334 /* If there's a virtual offset, look up that value in the vtable and
1335 adjust the pointer again. */
1336 if (virtual_offset)
1337 {
1338 tree vtabletmp;
1339 tree vtabletmp2;
1340 tree vtabletmp3;
1341
1342 if (!vtable_entry_type)
1343 {
1344 tree vfunc_type = make_node (FUNCTION_TYPE);
1345 TREE_TYPE (vfunc_type) = integer_type_node;
1346 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1347 layout_type (vfunc_type);
1348
1349 vtable_entry_type = build_pointer_type (vfunc_type);
1350 }
1351
1352 vtabletmp =
1353 create_tmp_reg (build_pointer_type
1354 (build_pointer_type (vtable_entry_type)), "vptr");
1355
1356 /* The vptr is always at offset zero in the object. */
1357 stmt = gimple_build_assign (vtabletmp,
1358 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1359 ptr));
1360 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1361
1362 /* Form the vtable address. */
1363 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1364 "vtableaddr");
1365 stmt = gimple_build_assign (vtabletmp2,
1366 build_simple_mem_ref (vtabletmp));
1367 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1368
1369 /* Find the entry with the vcall offset. */
1370 stmt = gimple_build_assign (vtabletmp2,
1371 fold_build_pointer_plus_loc (input_location,
1372 vtabletmp2,
1373 virtual_offset));
1374 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1375
1376 /* Get the offset itself. */
1377 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1378 "vcalloffset");
1379 stmt = gimple_build_assign (vtabletmp3,
1380 build_simple_mem_ref (vtabletmp2));
1381 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1382
1383 /* Adjust the `this' pointer. */
1384 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1385 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1386 GSI_CONTINUE_LINKING);
1387 }
1388
1389 if (!this_adjusting
1390 && fixed_offset != 0)
1391 /* Adjust the pointer by the constant. */
1392 {
1393 tree ptrtmp;
1394
1395 if (TREE_CODE (ptr) == VAR_DECL)
1396 ptrtmp = ptr;
1397 else
1398 {
1399 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1400 stmt = gimple_build_assign (ptrtmp, ptr);
1401 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1402 }
1403 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1404 ptrtmp, fixed_offset);
1405 }
1406
1407 /* Emit the statement and gimplify the adjustment expression. */
1408 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1409 stmt = gimple_build_assign (ret, ptr);
1410 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1411
1412 return ret;
1413 }
1414
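/* A this-adjusting thunk of the kind built here arises, for example, from
   multiple inheritance (illustrative C++ user code only):

     struct A { virtual void f (); int a; };
     struct B { virtual void g (); int b; };
     struct C : A, B { void g (); };

   Calling C::g through a pointer to the B subobject requires adjusting
   "this" by the offset of B within C before entering the real body, which
   is the FIXED_OFFSET adjustment above; virtual bases additionally need
   the vtable lookup done with VIRTUAL_OFFSET.  */
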
1415 /* Expand the thunk into gimple if possible.
1416 When FORCE_GIMPLE_THUNK is true, a gimple thunk is created and
1417 no assembler is produced.
1418 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1419 thunks that are not lowered. */
1420
1421 bool
1422 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1423 {
1424 bool this_adjusting = thunk.this_adjusting;
1425 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1426 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1427 tree virtual_offset = NULL;
1428 tree alias = callees->callee->decl;
1429 tree thunk_fndecl = decl;
1430 tree a;
1431
1432
1433 if (!force_gimple_thunk && this_adjusting
1434 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1435 virtual_value, alias))
1436 {
1437 const char *fnname;
1438 tree fn_block;
1439 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1440
1441 if (!output_asm_thunks)
1442 {
1443 analyzed = true;
1444 return false;
1445 }
1446
1447 if (in_lto_p)
1448 get_body ();
1449 a = DECL_ARGUMENTS (thunk_fndecl);
1450
1451 current_function_decl = thunk_fndecl;
1452
1453 /* Ensure thunks are emitted in their correct sections. */
1454 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1455
1456 DECL_RESULT (thunk_fndecl)
1457 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1458 RESULT_DECL, 0, restype);
1459 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1460 fnname = IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (thunk_fndecl));
1461
1462 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1463 create one. */
1464 fn_block = make_node (BLOCK);
1465 BLOCK_VARS (fn_block) = a;
1466 DECL_INITIAL (thunk_fndecl) = fn_block;
1467 init_function_start (thunk_fndecl);
1468 cfun->is_thunk = 1;
1469 insn_locations_init ();
1470 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1471 prologue_location = curr_insn_location ();
1472 assemble_start_function (thunk_fndecl, fnname);
1473
1474 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1475 fixed_offset, virtual_value, alias);
1476
1477 assemble_end_function (thunk_fndecl, fnname);
1478 insn_locations_finalize ();
1479 init_insn_lengths ();
1480 free_after_compilation (cfun);
1481 set_cfun (NULL);
1482 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1483 thunk.thunk_p = false;
1484 analyzed = false;
1485 }
1486 else
1487 {
1488 tree restype;
1489 basic_block bb, then_bb, else_bb, return_bb;
1490 gimple_stmt_iterator bsi;
1491 int nargs = 0;
1492 tree arg;
1493 int i;
1494 tree resdecl;
1495 tree restmp = NULL;
1496
1497 gimple call;
1498 gimple ret;
1499
1500 if (in_lto_p)
1501 get_body ();
1502 a = DECL_ARGUMENTS (thunk_fndecl);
1503
1504 current_function_decl = thunk_fndecl;
1505
1506 /* Ensure thunks are emitted in their correct sections. */
1507 resolve_unique_section (thunk_fndecl, 0, flag_function_sections);
1508
1509 DECL_IGNORED_P (thunk_fndecl) = 1;
1510 bitmap_obstack_initialize (NULL);
1511
1512 if (thunk.virtual_offset_p)
1513 virtual_offset = size_int (virtual_value);
1514
1515 /* Build the return declaration for the function. */
1516 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1517 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1518 {
1519 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1520 DECL_ARTIFICIAL (resdecl) = 1;
1521 DECL_IGNORED_P (resdecl) = 1;
1522 DECL_RESULT (thunk_fndecl) = resdecl;
1523 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1524 }
1525 else
1526 resdecl = DECL_RESULT (thunk_fndecl);
1527
1528 bb = then_bb = else_bb = return_bb = init_lowered_empty_function (thunk_fndecl, true);
1529
1530 bsi = gsi_start_bb (bb);
1531
1532 /* Build call to the function being thunked. */
1533 if (!VOID_TYPE_P (restype))
1534 {
1535 if (DECL_BY_REFERENCE (resdecl))
1536 restmp = gimple_fold_indirect_ref (resdecl);
1537 else if (!is_gimple_reg_type (restype))
1538 {
1539 restmp = resdecl;
1540 add_local_decl (cfun, restmp);
1541 BLOCK_VARS (DECL_INITIAL (current_function_decl)) = restmp;
1542 }
1543 else
1544 restmp = create_tmp_reg (restype, "retval");
1545 }
1546
1547 for (arg = a; arg; arg = DECL_CHAIN (arg))
1548 nargs++;
1549 auto_vec<tree> vargs (nargs);
1550 if (this_adjusting)
1551 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
1552 virtual_offset));
1553 else if (nargs)
1554 vargs.quick_push (a);
1555
1556 if (nargs)
1557 for (i = 1, arg = DECL_CHAIN (a); i < nargs; i++, arg = DECL_CHAIN (arg))
1558 {
1559 tree tmp = arg;
1560 if (!is_gimple_val (arg))
1561 {
1562 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
1563 (TREE_TYPE (arg)), "arg");
1564 gimple stmt = gimple_build_assign (tmp, arg);
1565 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1566 }
1567 vargs.quick_push (tmp);
1568 }
1569 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
1570 callees->call_stmt = call;
1571 gimple_call_set_from_thunk (call, true);
1572 if (restmp)
1573 {
1574 gimple_call_set_lhs (call, restmp);
1575 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
1576 TREE_TYPE (TREE_TYPE (alias))));
1577 }
1578 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
1579 if (!(gimple_call_flags (call) & ECF_NORETURN))
1580 {
1581 if (restmp && !this_adjusting
1582 && (fixed_offset || virtual_offset))
1583 {
1584 tree true_label = NULL_TREE;
1585
1586 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
1587 {
1588 gimple stmt;
1589 /* If the return type is a pointer, we need to
1590 protect against NULL. We know there will be an
1591 adjustment, because that's why we're emitting a
1592 thunk. */
1593 then_bb = create_basic_block (NULL, (void *) 0, bb);
1594 return_bb = create_basic_block (NULL, (void *) 0, then_bb);
1595 else_bb = create_basic_block (NULL, (void *) 0, else_bb);
1596 add_bb_to_loop (then_bb, bb->loop_father);
1597 add_bb_to_loop (return_bb, bb->loop_father);
1598 add_bb_to_loop (else_bb, bb->loop_father);
1599 remove_edge (single_succ_edge (bb));
1600 true_label = gimple_block_label (then_bb);
1601 stmt = gimple_build_cond (NE_EXPR, restmp,
1602 build_zero_cst (TREE_TYPE (restmp)),
1603 NULL_TREE, NULL_TREE);
1604 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1605 make_edge (bb, then_bb, EDGE_TRUE_VALUE);
1606 make_edge (bb, else_bb, EDGE_FALSE_VALUE);
1607 make_edge (return_bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1608 make_edge (then_bb, return_bb, EDGE_FALLTHRU);
1609 make_edge (else_bb, return_bb, EDGE_FALLTHRU);
1610 bsi = gsi_last_bb (then_bb);
1611 }
1612
1613 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
1614 fixed_offset, virtual_offset);
1615 if (true_label)
1616 {
1617 gimple stmt;
1618 bsi = gsi_last_bb (else_bb);
1619 stmt = gimple_build_assign (restmp,
1620 build_zero_cst (TREE_TYPE (restmp)));
1621 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
1622 bsi = gsi_last_bb (return_bb);
1623 }
1624 }
1625 else
1626 gimple_call_set_tail (call, true);
1627
1628 /* Build return value. */
1629 ret = gimple_build_return (restmp);
1630 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
1631 }
1632 else
1633 {
1634 gimple_call_set_tail (call, true);
1635 remove_edge (single_succ_edge (bb));
1636 }
1637
1638 cfun->gimple_df->in_ssa_p = true;
1639 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
1640 TREE_ASM_WRITTEN (thunk_fndecl) = false;
1641 delete_unreachable_blocks ();
1642 update_ssa (TODO_update_ssa);
1643 #ifdef ENABLE_CHECKING
1644 verify_flow_info ();
1645 #endif
1646 free_dominance_info (CDI_DOMINATORS);
1647
1648 /* Since we want to emit the thunk, we explicitly mark its name as
1649 referenced. */
1650 thunk.thunk_p = false;
1651 lowered = true;
1652 bitmap_obstack_release (NULL);
1653 }
1654 current_function_decl = NULL;
1655 set_cfun (NULL);
1656 return true;
1657 }
1658
1659 /* Assemble thunks and aliases associated with the node. */
1660
1661 void
1662 cgraph_node::assemble_thunks_and_aliases (void)
1663 {
1664 cgraph_edge *e;
1665 ipa_ref *ref;
1666
1667 for (e = callers; e;)
1668 if (e->caller->thunk.thunk_p)
1669 {
1670 cgraph_node *thunk = e->caller;
1671
1672 e = e->next_caller;
1673 thunk->expand_thunk (true, false);
1674 thunk->assemble_thunks_and_aliases ();
1675 }
1676 else
1677 e = e->next_caller;
1678
1679 FOR_EACH_ALIAS (this, ref)
1680 {
1681 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
1682 bool saved_written = TREE_ASM_WRITTEN (decl);
1683
1684 /* Force assemble_alias to really output the alias this time instead
1685 of buffering it in same alias pairs. */
1686 TREE_ASM_WRITTEN (decl) = 1;
1687 do_assemble_alias (alias->decl,
1688 DECL_ASSEMBLER_NAME (decl));
1689 alias->assemble_thunks_and_aliases ();
1690 TREE_ASM_WRITTEN (decl) = saved_written;
1691 }
1692 }
1693
1694 /* Expand function specified by node. */
1695
1696 void
1697 cgraph_node::expand (void)
1698 {
1699 location_t saved_loc;
1700
1701 /* We ought to not compile any inline clones. */
1702 gcc_assert (!global.inlined_to);
1703
1704 announce_function (decl);
1705 process = 0;
1706 gcc_assert (lowered);
1707 get_body ();
1708
1709 /* Generate RTL for the body of DECL. */
1710
1711 timevar_push (TV_REST_OF_COMPILATION);
1712
1713 gcc_assert (symtab->global_info_ready);
1714
1715 /* Initialize the default bitmap obstack. */
1716 bitmap_obstack_initialize (NULL);
1717
1718 /* Initialize the RTL code for the function. */
1719 current_function_decl = decl;
1720 saved_loc = input_location;
1721 input_location = DECL_SOURCE_LOCATION (decl);
1722 init_function_start (decl);
1723
1724 gimple_register_cfg_hooks ();
1725
1726 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
1727
1728 execute_all_ipa_transforms ();
1729
1730 /* Perform all tree transforms and optimizations. */
1731
1732 /* Signal the start of passes. */
1733 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
1734
1735 execute_pass_list (cfun, g->get_passes ()->all_passes);
1736
1737 /* Signal the end of passes. */
1738 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
1739
1740 bitmap_obstack_release (&reg_obstack);
1741
1742 /* Release the default bitmap obstack. */
1743 bitmap_obstack_release (NULL);
1744
1745 /* If requested, warn about function definitions where the function will
1746 return a value (usually of some struct or union type) which itself will
1747 take up a lot of stack space. */
1748 if (warn_larger_than && !DECL_EXTERNAL (decl) && TREE_TYPE (decl))
1749 {
1750 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
1751
1752 if (ret_type && TYPE_SIZE_UNIT (ret_type)
1753 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
1754 && 0 < compare_tree_int (TYPE_SIZE_UNIT (ret_type),
1755 larger_than_size))
1756 {
1757 unsigned int size_as_int
1758 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
1759
1760 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
1761 warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
1762 decl, size_as_int);
1763 else
1764 warning (OPT_Wlarger_than_, "size of return value of %q+D is larger than %wd bytes",
1765 decl, larger_than_size);
1766 }
1767 }
1768
1769 gimple_set_body (decl, NULL);
1770 if (DECL_STRUCT_FUNCTION (decl) == 0
1771 && !cgraph_node::get (decl)->origin)
1772 {
1773 /* Stop pointing to the local nodes about to be freed.
1774 But DECL_INITIAL must remain nonzero so we know this
1775 was an actual function definition.
1776 For a nested function, this is done in c_pop_function_context.
1777 If rest_of_compilation set this to 0, leave it 0. */
1778 if (DECL_INITIAL (decl) != 0)
1779 DECL_INITIAL (decl) = error_mark_node;
1780 }
1781
1782 input_location = saved_loc;
1783
1784 ggc_collect ();
1785 timevar_pop (TV_REST_OF_COMPILATION);
1786
1787 /* Make sure that the back end didn't give up on compiling. */
1788 gcc_assert (TREE_ASM_WRITTEN (decl));
1789 set_cfun (NULL);
1790 current_function_decl = NULL;
1791
1792 /* It would make a lot more sense to output thunks before the function body to get
1793 more forward and fewer backward jumps. This however would require solving a problem
1794 with comdats. See PR48668. Also aliases must come after the function itself to
1795 make one-pass assemblers, like the one on AIX, happy. See PR 50689.
1796 FIXME: Perhaps thunks should be moved before the function IFF they are not in comdat
1797 groups. */
1798 assemble_thunks_and_aliases ();
1799 release_body ();
1800   /* Eliminate all call edges.  This is important so the GIMPLE_CALL statements
1801      no longer point to the dead function body.  */
1802 remove_callees ();
1803 remove_all_references ();
1804 }
1805
1806 /* Node comparator that orders functions by the time at which they were
1807    first executed, i.e. by their time profile.  */
1808
1809 static int
1810 node_cmp (const void *pa, const void *pb)
1811 {
1812 const cgraph_node *a = *(const cgraph_node * const *) pa;
1813 const cgraph_node *b = *(const cgraph_node * const *) pb;
1814
1815   /* Functions with a time profile must come before those without one.  */
1816 if (!a->tp_first_run || !b->tp_first_run)
1817 return a->tp_first_run - b->tp_first_run;
1818
1819 return a->tp_first_run != b->tp_first_run
1820 ? b->tp_first_run - a->tp_first_run
1821 : b->order - a->order;
1822 }
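/* Illustrative note (not part of the original sources): after qsort with
   node_cmp, functions without a time profile (tp_first_run == 0) sit at the
   front of the array and profiled functions follow in descending
   tp_first_run order.  Because expand_all_functions below walks the array
   from the back, profiled functions end up expanded in ascending first-run
   order, followed by the unprofiled ones, e.g.

     tp_first_run after qsort:  0  0  7  3  1
     expansion order:           1, 3, 7, then the two unprofiled nodes.  */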
1823
1824 /* Expand all functions that must be output.
1825
1826 Attempt to topologically sort the nodes so function is output when
1827 all called functions are already assembled to allow data to be
1828 propagated across the callgraph. Use a stack to get smaller distance
1829 between a function and its callees (later we may choose to use a more
1830 sophisticated algorithm for function reordering; we will likely want
1831 to use subsections to make the output functions appear in top-down
1832 order). */
1833
1834 static void
1835 expand_all_functions (void)
1836 {
1837 cgraph_node *node;
1838 cgraph_node **order = XCNEWVEC (cgraph_node *,
1839 symtab->cgraph_count);
1840 unsigned int expanded_func_count = 0, profiled_func_count = 0;
1841 int order_pos, new_order_pos = 0;
1842 int i;
1843
1844 order_pos = ipa_reverse_postorder (order);
1845 gcc_assert (order_pos == symtab->cgraph_count);
1846
1847   /* The garbage collector may remove the inline clones we eliminate during
1848      optimization, so we must be sure not to reference them.  */
1849 for (i = 0; i < order_pos; i++)
1850 if (order[i]->process)
1851 order[new_order_pos++] = order[i];
1852
1853 if (flag_profile_reorder_functions)
1854 qsort (order, new_order_pos, sizeof (cgraph_node *), node_cmp);
1855
1856 for (i = new_order_pos - 1; i >= 0; i--)
1857 {
1858 node = order[i];
1859
1860 if (node->process)
1861 {
1862 expanded_func_count++;
1863 	  if (node->tp_first_run)
1864 profiled_func_count++;
1865
1866 if (symtab->dump_file)
1867 fprintf (symtab->dump_file,
1868 "Time profile order in expand_all_functions:%s:%d\n",
1869 node->asm_name (), node->tp_first_run);
1870 node->process = 0;
1871 node->expand ();
1872 }
1873 }
1874
1875 if (dump_file)
1876 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
1877 main_input_filename, profiled_func_count, expanded_func_count);
1878
1879 if (symtab->dump_file && flag_profile_reorder_functions)
1880 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
1881 profiled_func_count, expanded_func_count);
1882
1883 symtab->process_new_functions ();
1884 free_gimplify_stack ();
1885
1886 free (order);
1887 }
1888
1889 /* This is used to sort the node types by the cgraph order number. */
1890
1891 enum cgraph_order_sort_kind
1892 {
1893 ORDER_UNDEFINED = 0,
1894 ORDER_FUNCTION,
1895 ORDER_VAR,
1896 ORDER_ASM
1897 };
1898
1899 struct cgraph_order_sort
1900 {
1901 enum cgraph_order_sort_kind kind;
1902 union
1903 {
1904 cgraph_node *f;
1905 varpool_node *v;
1906 asm_node *a;
1907 } u;
1908 };
1909
1910 /* Output all functions, variables, and asm statements in the order
1911 according to their order fields, which is the order in which they
1912 appeared in the file. This implements -fno-toplevel-reorder. In
1913 this mode we may output functions and variables which don't really
1914 need to be output.
1915    When NO_REORDER is true, only do this for symbols marked with the
        no_reorder attribute.  */
1916
1917 static void
1918 output_in_order (bool no_reorder)
1919 {
1920 int max;
1921 cgraph_order_sort *nodes;
1922 int i;
1923 cgraph_node *pf;
1924 varpool_node *pv;
1925 asm_node *pa;
1926 max = symtab->order;
1927 nodes = XCNEWVEC (cgraph_order_sort, max);
1928
1929 FOR_EACH_DEFINED_FUNCTION (pf)
1930 {
1931 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
1932 {
1933 if (no_reorder && !pf->no_reorder)
1934 continue;
1935 i = pf->order;
1936 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1937 nodes[i].kind = ORDER_FUNCTION;
1938 nodes[i].u.f = pf;
1939 }
1940 }
1941
1942 FOR_EACH_DEFINED_VARIABLE (pv)
1943 if (!DECL_EXTERNAL (pv->decl))
1944 {
1945 if (no_reorder && !pv->no_reorder)
1946 continue;
1947 i = pv->order;
1948 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1949 nodes[i].kind = ORDER_VAR;
1950 nodes[i].u.v = pv;
1951 }
1952
1953 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
1954 {
1955 i = pa->order;
1956 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1957 nodes[i].kind = ORDER_ASM;
1958 nodes[i].u.a = pa;
1959 }
1960
1961 /* In toplevel reorder mode we output all statics; mark them as needed. */
1962
1963 for (i = 0; i < max; ++i)
1964 if (nodes[i].kind == ORDER_VAR)
1965 nodes[i].u.v->finalize_named_section_flags ();
1966
1967 for (i = 0; i < max; ++i)
1968 {
1969 switch (nodes[i].kind)
1970 {
1971 case ORDER_FUNCTION:
1972 nodes[i].u.f->process = 0;
1973 nodes[i].u.f->expand ();
1974 break;
1975
1976 case ORDER_VAR:
1977 nodes[i].u.v->assemble_decl ();
1978 break;
1979
1980 case ORDER_ASM:
1981 assemble_asm (nodes[i].u.a->asm_str);
1982 break;
1983
1984 case ORDER_UNDEFINED:
1985 break;
1986
1987 default:
1988 gcc_unreachable ();
1989 }
1990 }
1991
1992 symtab->clear_asm_symbols ();
1993
1994 free (nodes);
1995 }
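/* Illustrative example (not part of the original sources): assuming the
   no_reorder attribute spelling added by this patch, definitions such as

     int counter __attribute__ ((no_reorder)) = 0;
     asm ("# marker kept between counter and bump");
     __attribute__ ((no_reorder)) int bump (void) { return ++counter; }

   keep their source order in the assembler output: output_in_order (true)
   emits only the marked symbols plus toplevel asm statements in order, while
   with -fno-toplevel-reorder output_in_order (false) does the same for every
   defined symbol.  */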
1996
1997 static void
1998 ipa_passes (void)
1999 {
2000 gcc::pass_manager *passes = g->get_passes ();
2001
2002 set_cfun (NULL);
2003 current_function_decl = NULL;
2004 gimple_register_cfg_hooks ();
2005 bitmap_obstack_initialize (NULL);
2006
2007 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2008
2009 if (!in_lto_p)
2010 {
2011 execute_ipa_pass_list (passes->all_small_ipa_passes);
2012 if (seen_error ())
2013 return;
2014 }
2015
2016   /* This extra symtab_remove_unreachable_nodes pass tends to catch some
2017      devirtualization and other cases where removal needs to iterate.  */
2018 symtab->remove_unreachable_nodes (true, symtab->dump_file);
2019
2020 /* If pass_all_early_optimizations was not scheduled, the state of
2021 the cgraph will not be properly updated. Update it now. */
2022 if (symtab->state < IPA_SSA)
2023 symtab->state = IPA_SSA;
2024
2025 if (!in_lto_p)
2026 {
2027 /* Generate coverage variables and constructors. */
2028 coverage_finish ();
2029
2030 /* Process new functions added. */
2031 set_cfun (NULL);
2032 current_function_decl = NULL;
2033 symtab->process_new_functions ();
2034
2035 execute_ipa_summary_passes
2036 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2037 }
2038
2039 /* Some targets need to handle LTO assembler output specially. */
2040 if (flag_generate_lto)
2041 targetm.asm_out.lto_start ();
2042
2043 if (!in_lto_p)
2044 ipa_write_summaries ();
2045
2046 if (flag_generate_lto)
2047 targetm.asm_out.lto_end ();
2048
2049 if (!flag_ltrans && (in_lto_p || !flag_lto || flag_fat_lto_objects))
2050 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2051 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2052
2053 bitmap_obstack_release (NULL);
2054 }
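/* Illustrative note (not part of the original sources): the guard on
   all_regular_ipa_passes above means the execute stage of the big IPA passes
   runs for a plain non-LTO compile, for a -flto -ffat-lto-objects compile,
   and at WPA time when LTO bytecode is read back in (in_lto_p), but not for
   a slim -flto compile (their summaries are streamed out instead) and not in
   the parallel ltrans stage (flag_ltrans), where the IPA decisions arrive
   already made from WPA.  */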
2055
2056
2057 /* Return the identifier naming the symbol that DECL is an alias of.  */
2058
2059 static tree
2060 get_alias_symbol (tree decl)
2061 {
2062 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2063 return get_identifier (TREE_STRING_POINTER
2064 (TREE_VALUE (TREE_VALUE (alias))));
2065 }
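/* Illustrative example (not part of the original sources): for a pair of
   declarations such as

     void impl (void) { }
     void shim (void) __attribute__ ((alias ("impl")));

   get_alias_symbol walks the "alias" attribute on shim's decl and returns
   the identifier "impl" taken from the attribute's string argument.  */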
2066
2067
2068 /* Weakrefs may be associated with external decls and thus not output
2069    at expansion time.  Emit all necessary aliases.  */
2070
2071 void
2072 symbol_table::output_weakrefs (void)
2073 {
2074 symtab_node *node;
2075 FOR_EACH_SYMBOL (node)
2076 if (node->alias
2077 && !TREE_ASM_WRITTEN (node->decl)
2078 && node->weakref)
2079 {
2080 tree target;
2081
2082 	/* Weakrefs are special in not requiring the target to be defined in the
2083 	   current compilation unit.  It is thus a bit hard to work out what we
2084 	   want to alias.
2085 	   When the alias target is defined, we need to fetch it from the symtab
2086 	   reference; otherwise it is pointed to by alias_target.  */
2087 if (node->alias_target)
2088 target = (DECL_P (node->alias_target)
2089 ? DECL_ASSEMBLER_NAME (node->alias_target)
2090 : node->alias_target);
2091 else if (node->analyzed)
2092 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2093 else
2094 {
2095 gcc_unreachable ();
2096 target = get_alias_symbol (node->decl);
2097 }
2098 do_assemble_alias (node->decl, target);
2099 }
2100 }
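/* Illustrative example (not part of the original sources): a weakref such as

     static void local_abort (void) __attribute__ ((weakref ("abort")));

   does not require its target to be defined in this translation unit, so the
   alias may have to be emitted here rather than when the (possibly absent)
   target is expanded.  */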
2101
2102 /* Perform simple optimizations based on callgraph. */
2103
2104 void
2105 symbol_table::compile (void)
2106 {
2107 if (seen_error ())
2108 return;
2109
2110 #ifdef ENABLE_CHECKING
2111 symtab_node::verify_symtab_nodes ();
2112 #endif
2113
2114 timevar_push (TV_CGRAPHOPT);
2115 if (pre_ipa_mem_report)
2116 {
2117 fprintf (stderr, "Memory consumption before IPA\n");
2118 dump_memory_report (false);
2119 }
2120 if (!quiet_flag)
2121 fprintf (stderr, "Performing interprocedural optimizations\n");
2122 state = IPA;
2123
2124 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2125 if (flag_lto)
2126 lto_streamer_hooks_init ();
2127
2128   /* Don't run the IPA passes if there were any errors or sorry messages.  */
2129 if (!seen_error ())
2130 ipa_passes ();
2131
2132 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2133 if (seen_error ()
2134 || (!in_lto_p && flag_lto && !flag_fat_lto_objects))
2135 {
2136 timevar_pop (TV_CGRAPHOPT);
2137 return;
2138 }
2139
2140   /* This pass removes bodies of extern inline functions we never inlined.
2141      Do this late so other IPA passes see what is really going on.
2142      FIXME: This should be run just after inlining by the pass manager.  */
2143 remove_unreachable_nodes (false, dump_file);
2144 global_info_ready = true;
2145 if (dump_file)
2146 {
2147 fprintf (dump_file, "Optimized ");
2148       symtab_node::dump_table (dump_file);
2149 }
2150 if (post_ipa_mem_report)
2151 {
2152 fprintf (stderr, "Memory consumption after IPA\n");
2153 dump_memory_report (false);
2154 }
2155 timevar_pop (TV_CGRAPHOPT);
2156
2157 /* Output everything. */
2158 (*debug_hooks->assembly_start) ();
2159 if (!quiet_flag)
2160 fprintf (stderr, "Assembling functions:\n");
2161 #ifdef ENABLE_CHECKING
2162 symtab_node::verify_symtab_nodes ();
2163 #endif
2164
2165 materialize_all_clones ();
2166 bitmap_obstack_initialize (NULL);
2167 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2168 bitmap_obstack_release (NULL);
2169 mark_functions_to_output ();
2170
2171   /* When weakref support is missing, we automatically translate all
2172      references to NODE to references to its ultimate alias target.
2173      The renaming mechanism uses the flag IDENTIFIER_TRANSPARENT_ALIAS and
2174 TREE_CHAIN.
2175
2176 Set up this mapping before we output any assembler but once we are sure
2177 that all symbol renaming is done.
2178
2179      FIXME: All this ugliness can go away if we just do renaming at the GIMPLE
2180      level by physically rewriting the IL.  At the moment we can only redirect
2181 calls, so we need infrastructure for renaming references as well. */
2182 #ifndef ASM_OUTPUT_WEAKREF
2183 symtab_node *node;
2184
2185 FOR_EACH_SYMBOL (node)
2186 if (node->alias
2187 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2188 {
2189 IDENTIFIER_TRANSPARENT_ALIAS
2190 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2191 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2192 = (node->alias_target ? node->alias_target
2193 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2194 }
2195 #endif
2196
2197 state = EXPANSION;
2198
2199 if (!flag_toplevel_reorder)
2200 output_in_order (false);
2201 else
2202 {
2203       /* First output the asm statements and anything marked no_reorder.  The
2204 	 process flag is cleared for these nodes, so we skip them later.  */
2205 output_in_order (true);
2206 expand_all_functions ();
2207 output_variables ();
2208 }
2209
2210 process_new_functions ();
2211 state = FINISHED;
2212 output_weakrefs ();
2213
2214 if (dump_file)
2215 {
2216 fprintf (dump_file, "\nFinal ");
2217 symtab_node::dump_table (dump_file);
2218 }
2219 #ifdef ENABLE_CHECKING
2220 symtab_node::verify_symtab_nodes ();
2221 /* Double check that all inline clones are gone and that all
2222 function bodies have been released from memory. */
2223 if (!seen_error ())
2224 {
2225 cgraph_node *node;
2226 bool error_found = false;
2227
2228 FOR_EACH_DEFINED_FUNCTION (node)
2229 if (node->global.inlined_to
2230 || gimple_has_body_p (node->decl))
2231 {
2232 error_found = true;
2233 node->debug ();
2234 }
2235 if (error_found)
2236 internal_error ("nodes with unreleased memory found");
2237 }
2238 #endif
2239 }
2240
2241
2242 /* Analyze the whole compilation unit once it is parsed completely. */
2243
2244 void
2245 symbol_table::finalize_compilation_unit (void)
2246 {
2247 timevar_push (TV_CGRAPH);
2248
2249 /* If we're here there's no current function anymore. Some frontends
2250 are lazy in clearing these. */
2251 current_function_decl = NULL;
2252 set_cfun (NULL);
2253
2254   /* Do not skip analyzing the functions if there were errors; otherwise we
2255      would miss diagnostics for the following functions.  */
2256
2257 /* Emit size functions we didn't inline. */
2258 finalize_size_functions ();
2259
2260 /* Mark alias targets necessary and emit diagnostics. */
2261 handle_alias_pairs ();
2262
2263 if (!quiet_flag)
2264 {
2265 fprintf (stderr, "\nAnalyzing compilation unit\n");
2266 fflush (stderr);
2267 }
2268
2269 if (flag_dump_passes)
2270 dump_passes ();
2271
2272 /* Gimplify and lower all functions, compute reachability and
2273 remove unreachable nodes. */
2274 analyze_functions ();
2275
2276 /* Mark alias targets necessary and emit diagnostics. */
2277 handle_alias_pairs ();
2278
2279 /* Gimplify and lower thunks. */
2280 analyze_functions ();
2281
2282 /* Finally drive the pass manager. */
2283 compile ();
2284
2285 timevar_pop (TV_CGRAPH);
2286 }
2287
2288 /* Turn this cgraph_node into a wrapper that forwards to the TARGET node.
2289    A thunk is used to implement this kind of wrapper method.  */
2290
2291 void
2292 cgraph_node::create_wrapper (cgraph_node *target)
2293 {
2294   /* Preserve DECL_RESULT so we get the right by-reference flag.  */
2295 tree decl_result = DECL_RESULT (decl);
2296
2297 /* Remove the function's body. */
2298 release_body ();
2299 reset ();
2300
2301 DECL_RESULT (decl) = decl_result;
2302 DECL_INITIAL (decl) = NULL;
2303 allocate_struct_function (decl, false);
2304 set_cfun (NULL);
2305
2306 /* Turn alias into thunk and expand it into GIMPLE representation. */
2307 definition = true;
2308 thunk.thunk_p = true;
2309 thunk.this_adjusting = false;
2310
2311 cgraph_edge *e = create_edge (target, NULL, 0, CGRAPH_FREQ_BASE);
2312
2313 expand_thunk (false, true);
2314 e->call_stmt_cannot_inline_p = true;
2315
2316 /* Inline summary set-up. */
2317 analyze ();
2318 inline_analyze_function (this);
2319 }
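/* Illustrative note (not part of the original sources): conceptually the
   result is what one would get by rewriting the body of this node as

     int wrapper (int x) { return target_fn (x); }

   but done at the symbol-table level: the old body is released, the node is
   turned into a non-this-adjusting thunk with a single call edge to TARGET,
   and expand_thunk materializes the forwarding call as GIMPLE.  */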
2320
2321 #include "gt-cgraphunit.h"