1 /* Driver of optimization process
2 Copyright (C) 2003-2020 Free Software Foundation, Inc.
3 Contributed by Jan Hubicka
4
5 This file is part of GCC.
6
7 GCC is free software; you can redistribute it and/or modify it under
8 the terms of the GNU General Public License as published by the Free
9 Software Foundation; either version 3, or (at your option) any later
10 version.
11
12 GCC is distributed in the hope that it will be useful, but WITHOUT ANY
13 WARRANTY; without even the implied warranty of MERCHANTABILITY or
14 FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
15 for more details.
16
17 You should have received a copy of the GNU General Public License
18 along with GCC; see the file COPYING3. If not see
19 <http://www.gnu.org/licenses/>. */
20
21 /* This module implements the main driver of the compilation process.
22
23    The main scope of this file is to act as an interface between
24    the tree based frontends and the backend.
25
26    The front-end is supposed to use the following functionality:
27
28 - finalize_function
29
30       This function is called once the front end has parsed the whole function
31       body and it is certain that neither the body nor the declaration will change.
32
33       (There is one exception, needed for implementing GCC extern inline
34       functions.)
35
36 - varpool_finalize_decl
37
38       This function has the same behavior as the above but is used for static
39 variables.
40
41 - add_asm_node
42
43       Inserts a new toplevel ASM statement.
44
45 - finalize_compilation_unit
46
47       This function is called once the (source level) compilation unit is
48       finalized and will no longer change.
49
50 The symbol table is constructed starting from the trivially needed
51 symbols finalized by the frontend. Functions are lowered into
52 GIMPLE representation and callgraph/reference lists are constructed.
53 Those are used to discover other necessary functions and variables.
54
55 At the end the bodies of unreachable functions are removed.
56
57 The function can be called multiple times when multiple source level
58 compilation units are combined.
59
60 - compile
61
62 This passes control to the back-end. Optimizations are performed and
63 final assembler is generated. This is done in the following way. Note
64 that with link time optimization the process is split into three
65 stages (compile time, linktime analysis and parallel linktime as
66      indicated below).
67
68 Compile time:
69
70 1) Inter-procedural optimization.
71 (ipa_passes)
72
73 This part is further split into:
74
75 a) early optimizations. These are local passes executed in
76 the topological order on the callgraph.
77
78 The purpose of early optimizations is to optimize away simple
79 things that may otherwise confuse IP analysis. Very simple
80            propagation across the callgraph is done, e.g. to discover
81            functions without side effects, and simple inlining is performed.
82
83 b) early small interprocedural passes.
84
85 Those are interprocedural passes executed only at compilation
86 time. These include, for example, transactional memory lowering,
87 unreachable code removal and other simple transformations.
88
89 c) IP analysis stage. All interprocedural passes do their
90 analysis.
91
92 Interprocedural passes differ from small interprocedural
93            passes by their ability to operate across the whole program
94 at linktime. Their analysis stage is performed early to
95 both reduce linking times and linktime memory usage by
96            not having to represent the whole program in memory.
97
98 d) LTO streaming. When doing LTO, everything important gets
99 streamed into the object file.
100
101       Compile time and/or linktime analysis stage (WPA):
102
103          At linktime units get streamed back and the symbol table is
104 merged. Function bodies are not streamed in and not
105 available.
106 e) IP propagation stage. All IP passes execute their
107 IP propagation. This is done based on the earlier analysis
108 without having function bodies at hand.
109 f) Ltrans streaming. When doing WHOPR LTO, the program
110 is partitioned and streamed into multiple object files.
111
112 Compile time and/or parallel linktime stage (ltrans)
113
114 Each of the object files is streamed back and compiled
115          separately.  Now the function bodies become available
116 again.
117
118 2) Virtual clone materialization
119 (cgraph_materialize_clone)
120
121 IP passes can produce copies of existing functions (such
122 as versioned clones or inline clones) without actually
123 manipulating their bodies by creating virtual clones in
124 the callgraph. At this time the virtual clones are
125          turned into real functions.
126 3) IP transformation
127
128          All IP passes transform function bodies based on the earlier
129          decisions of the IP propagation.
130
131 4) late small IP passes
132
133 Simple IP passes working within single program partition.
134
135 5) Expansion
136 (expand_all_functions)
137
138          At this stage functions that need to be output into
139          assembler are identified and compiled in topological order.
140 6) Output of variables and aliases
141          Now it is known which variable references were not optimized
142          out and thus all variables are output to the file.
143
144 Note that with -fno-toplevel-reorder passes 5 and 6
145 are combined together in cgraph_output_in_order.
146
147 Finally there are functions to manipulate the callgraph from
148 backend.
149 - cgraph_add_new_function is used to add backend produced
150 functions introduced after the unit is finalized.
151        The functions are enqueued for later processing and inserted
152        into the callgraph with cgraph_process_new_functions.
153
154 - cgraph_function_versioning
155
156        produces a copy of a function into a new one (a version)
157        and applies simple transformations to it.
158 */
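/* As a rough illustration of the interface described above, a frontend is
   expected to drive the callgraph along the lines of the sketch below.  This
   is not part of GCC itself; parse_next_function and parse_next_variable are
   hypothetical placeholders for a frontend's own parser entry points.  */
#if 0
static void
frontend_compile_unit (void)
{
  tree decl;

  /* Hand each fully parsed function body over to the callgraph.  */
  while ((decl = parse_next_function ()) != NULL_TREE)
    cgraph_node::finalize_function (decl, false);

  /* Likewise for file scope variables.  */
  while ((decl = parse_next_variable ()) != NULL_TREE)
    varpool_node::finalize_decl (decl);

  /* Let the middle end analyze, optimize and emit the whole unit.  */
  symtab->finalize_compilation_unit ();
}
#endif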
159
160 #include "config.h"
161 #include "system.h"
162 #include "coretypes.h"
163 #include "backend.h"
164 #include "target.h"
165 #include "rtl.h"
166 #include "tree.h"
167 #include "gimple.h"
168 #include "cfghooks.h"
169 #include "regset.h" /* FIXME: For reg_obstack. */
170 #include "alloc-pool.h"
171 #include "tree-pass.h"
172 #include "stringpool.h"
173 #include "gimple-ssa.h"
174 #include "cgraph.h"
175 #include "coverage.h"
176 #include "lto-streamer.h"
177 #include "fold-const.h"
178 #include "varasm.h"
179 #include "stor-layout.h"
180 #include "output.h"
181 #include "cfgcleanup.h"
182 #include "gimple-fold.h"
183 #include "gimplify.h"
184 #include "gimple-iterator.h"
185 #include "gimplify-me.h"
186 #include "tree-cfg.h"
187 #include "tree-into-ssa.h"
188 #include "tree-ssa.h"
189 #include "langhooks.h"
190 #include "toplev.h"
191 #include "debug.h"
192 #include "symbol-summary.h"
193 #include "tree-vrp.h"
194 #include "ipa-prop.h"
195 #include "gimple-pretty-print.h"
196 #include "plugin.h"
197 #include "ipa-fnsummary.h"
198 #include "ipa-utils.h"
199 #include "except.h"
200 #include "cfgloop.h"
201 #include "context.h"
202 #include "pass_manager.h"
203 #include "tree-nested.h"
204 #include "dbgcnt.h"
205 #include "lto-section-names.h"
206 #include "stringpool.h"
207 #include "attribs.h"
208
209 /* Queue of cgraph nodes scheduled to be added into cgraph. This is a
210 secondary queue used during optimization to accommodate passes that
211 may generate new functions that need to be optimized and expanded. */
212 vec<cgraph_node *> cgraph_new_nodes;
213
214 static void expand_all_functions (void);
215 static void mark_functions_to_output (void);
216 static void handle_alias_pairs (void);
217
218 /* Used for vtable lookup in thunk adjusting. */
219 static GTY (()) tree vtable_entry_type;
220
221 /* Return true if this symbol is a function from the C frontend specified
222 directly in RTL form (with "__RTL"). */
223
224 bool
225 symtab_node::native_rtl_p () const
226 {
227 if (TREE_CODE (decl) != FUNCTION_DECL)
228 return false;
229 if (!DECL_STRUCT_FUNCTION (decl))
230 return false;
231 return DECL_STRUCT_FUNCTION (decl)->curr_properties & PROP_rtl;
232 }
233
234 /* Determine if the symbol declaration is needed.  That is, visible to
235    something either outside this translation unit or something magic in the
236    system configury.  */
237 bool
238 symtab_node::needed_p (void)
239 {
240   /* Double check that no one output the function into the assembly file
241      early.  */
242 if (!native_rtl_p ())
243 gcc_checking_assert
244 (!DECL_ASSEMBLER_NAME_SET_P (decl)
245 || !TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)));
246
247 if (!definition)
248 return false;
249
250 if (DECL_EXTERNAL (decl))
251 return false;
252
253 /* If the user told us it is used, then it must be so. */
254 if (force_output)
255 return true;
256
257 /* ABI forced symbols are needed when they are external. */
258 if (forced_by_abi && TREE_PUBLIC (decl))
259 return true;
260
261 /* Keep constructors, destructors and virtual functions. */
262 if (TREE_CODE (decl) == FUNCTION_DECL
263 && (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl)))
264 return true;
265
266 /* Externally visible variables must be output. The exception is
267 COMDAT variables that must be output only when they are needed. */
268 if (TREE_PUBLIC (decl) && !DECL_COMDAT (decl))
269 return true;
270
271 return false;
272 }
273
274 /* Head and terminator of the queue of nodes to be processed while building
275 callgraph. */
276
277 static symtab_node symtab_terminator (SYMTAB_SYMBOL);
278 static symtab_node *queued_nodes = &symtab_terminator;
279
280 /* Add NODE to the queue starting at QUEUED_NODES.  The queue is linked via
281    AUX pointers and terminated by a pointer to SYMTAB_TERMINATOR.  */
282
283 static void
284 enqueue_node (symtab_node *node)
285 {
286 if (node->aux)
287 return;
288 gcc_checking_assert (queued_nodes);
289 node->aux = queued_nodes;
290 queued_nodes = node;
291 }
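/* Consumers drain this worklist by popping from the head and following the
   AUX chain until it reaches SYMTAB_TERMINATOR again, as analyze_functions
   does below:

     node = queued_nodes;
     queued_nodes = (symtab_node *) queued_nodes->aux;

   (a restatement of the loop in analyze_functions, shown here purely for
   illustration).  */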
292
293 /* Process CGRAPH_NEW_FUNCTIONS and perform actions necessary to add these
294    functions into the callgraph so that they look like ordinary reachable
295    functions inserted into the callgraph already at construction time.  */
296
297 void
298 symbol_table::process_new_functions (void)
299 {
300 tree fndecl;
301
302 if (!cgraph_new_nodes.exists ())
303 return;
304
305 handle_alias_pairs ();
306   /* Note that this queue may grow as it is being processed, as the new
307 functions may generate new ones. */
308 for (unsigned i = 0; i < cgraph_new_nodes.length (); i++)
309 {
310 cgraph_node *node = cgraph_new_nodes[i];
311 fndecl = node->decl;
312 switch (state)
313 {
314 case CONSTRUCTION:
315 	  /* At construction time we just need to finalize the function and move
316 	     it into the reachable functions list.  */
317
318 cgraph_node::finalize_function (fndecl, false);
319 call_cgraph_insertion_hooks (node);
320 enqueue_node (node);
321 break;
322
323 case IPA:
324 case IPA_SSA:
325 case IPA_SSA_AFTER_INLINING:
326 	  /* When IPA optimization has already started, do all essential
327 	     transformations that have already been performed on the whole
328 	     cgraph but not on this function.  */
329
330 gimple_register_cfg_hooks ();
331 if (!node->analyzed)
332 node->analyze ();
333 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
334 if ((state == IPA_SSA || state == IPA_SSA_AFTER_INLINING)
335 && !gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
336 {
337 bool summaried_computed = ipa_fn_summaries != NULL;
338 g->get_passes ()->execute_early_local_passes ();
339 	      /* Early passes compute inline parameters to do inlining
340 		 and splitting.  This is redundant for functions added late.
341 		 Just throw away whatever they computed.  */
342 if (!summaried_computed)
343 {
344 ipa_free_fn_summary ();
345 ipa_free_size_summary ();
346 }
347 }
348 else if (ipa_fn_summaries != NULL)
349 compute_fn_summary (node, true);
350 free_dominance_info (CDI_POST_DOMINATORS);
351 free_dominance_info (CDI_DOMINATORS);
352 pop_cfun ();
353 call_cgraph_insertion_hooks (node);
354 break;
355
356 case EXPANSION:
357 /* Functions created during expansion shall be compiled
358 directly. */
359 node->process = 0;
360 call_cgraph_insertion_hooks (node);
361 node->expand ();
362 break;
363
364 default:
365 gcc_unreachable ();
366 break;
367 }
368 }
369
370 cgraph_new_nodes.release ();
371 }
372
373 /* As a GCC extension we allow redefinition of the function.  The
374    semantics when the two bodies differ are not well defined.
375    We replace the old body with the new body so in unit-at-a-time mode
376    we always use the new body, while in normal mode we may end up with
377    the old body inlined into some functions and the new body expanded
378    and inlined in others.
379
380    ??? It may make more sense to use one body for inlining and the other
381    body for expanding the function but this is difficult to do.  */
382
383 void
384 cgraph_node::reset (void)
385 {
386 /* If process is set, then we have already begun whole-unit analysis.
387 This is *not* testing for whether we've already emitted the function.
388 That case can be sort-of legitimately seen with real function redefinition
389 errors. I would argue that the front end should never present us with
390 such a case, but don't enforce that for now. */
391 gcc_assert (!process);
392
393 /* Reset our data structures so we can analyze the function again. */
394 inlined_to = NULL;
395 memset (&rtl, 0, sizeof (rtl));
396 analyzed = false;
397 definition = false;
398 alias = false;
399 transparent_alias = false;
400 weakref = false;
401 cpp_implicit_alias = false;
402
403 remove_callees ();
404 remove_all_references ();
405 }
406
407 /* Return true when there are references to the node. INCLUDE_SELF is
408 true if a self reference counts as a reference. */
409
410 bool
411 symtab_node::referred_to_p (bool include_self)
412 {
413 ipa_ref *ref = NULL;
414
415 /* See if there are any references at all. */
416 if (iterate_referring (0, ref))
417 return true;
418 /* For functions check also calls. */
419 cgraph_node *cn = dyn_cast <cgraph_node *> (this);
420 if (cn && cn->callers)
421 {
422 if (include_self)
423 return true;
424 for (cgraph_edge *e = cn->callers; e; e = e->next_caller)
425 if (e->caller != this)
426 return true;
427 }
428 return false;
429 }
430
431 /* DECL has been parsed. Take it, queue it, compile it at the whim of the
432 logic in effect. If NO_COLLECT is true, then our caller cannot stand to have
433 the garbage collector run at the moment. We would need to either create
434 a new GC context, or just not compile right now. */
435
436 void
437 cgraph_node::finalize_function (tree decl, bool no_collect)
438 {
439 cgraph_node *node = cgraph_node::get_create (decl);
440
441 if (node->definition)
442 {
443 /* Nested functions should only be defined once. */
444 gcc_assert (!DECL_CONTEXT (decl)
445 || TREE_CODE (DECL_CONTEXT (decl)) != FUNCTION_DECL);
446 node->reset ();
447 node->redefined_extern_inline = true;
448 }
449
450 /* Set definition first before calling notice_global_symbol so that
451 it is available to notice_global_symbol. */
452 node->definition = true;
453 notice_global_symbol (decl);
454 node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
455 if (!flag_toplevel_reorder)
456 node->no_reorder = true;
457
458 /* With -fkeep-inline-functions we are keeping all inline functions except
459 for extern inline ones. */
460 if (flag_keep_inline_functions
461 && DECL_DECLARED_INLINE_P (decl)
462 && !DECL_EXTERNAL (decl)
463 && !DECL_DISREGARD_INLINE_LIMITS (decl))
464 node->force_output = 1;
465
466 /* __RTL functions were already output as soon as they were parsed (due
467 to the large amount of global state in the backend).
468 Mark such functions as "force_output" to reflect the fact that they
469 will be in the asm file when considering the symbols they reference.
470 The attempt to output them later on will bail out immediately. */
471 if (node->native_rtl_p ())
472 node->force_output = 1;
473
474   /* When not optimizing, also output the static functions (see
475      PR24561), but don't do so for always_inline functions, functions
476      declared inline and nested functions.  These were optimized out
477      in the original implementation and it is unclear whether we want
478      to change the behavior here.  */
479 if (((!opt_for_fn (decl, optimize) || flag_keep_static_functions
480 || node->no_reorder)
481 && !node->cpp_implicit_alias
482 && !DECL_DISREGARD_INLINE_LIMITS (decl)
483 && !DECL_DECLARED_INLINE_P (decl)
484 && !(DECL_CONTEXT (decl)
485 && TREE_CODE (DECL_CONTEXT (decl)) == FUNCTION_DECL))
486 && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
487 node->force_output = 1;
488
489 /* If we've not yet emitted decl, tell the debug info about it. */
490 if (!TREE_ASM_WRITTEN (decl))
491 (*debug_hooks->deferred_inline_function) (decl);
492
493 if (!no_collect)
494 ggc_collect ();
495
496 if (symtab->state == CONSTRUCTION
497 && (node->needed_p () || node->referred_to_p ()))
498 enqueue_node (node);
499 }
500
501 /* Add the function FNDECL to the call graph.
502    Unlike finalize_function, this function is intended to be used
503    by the middle end and allows insertion of new functions at an arbitrary
504    point of compilation.  The function can be in either high, low or SSA
505    GIMPLE form.
506
507    The function is assumed to be reachable and to have its address taken (so
508    no API breaking optimizations are performed on it).
509
510    The main work done by this function is to enqueue the function for later
511    processing, avoiding the need for the passes to be re-entrant.  */
512
513 void
514 cgraph_node::add_new_function (tree fndecl, bool lowered)
515 {
516 gcc::pass_manager *passes = g->get_passes ();
517 cgraph_node *node;
518
519 if (dump_file)
520 {
521 struct function *fn = DECL_STRUCT_FUNCTION (fndecl);
522 const char *function_type = ((gimple_has_body_p (fndecl))
523 ? (lowered
524 ? (gimple_in_ssa_p (fn)
525 ? "ssa gimple"
526 : "low gimple")
527 : "high gimple")
528 : "to-be-gimplified");
529 fprintf (dump_file,
530 "Added new %s function %s to callgraph\n",
531 function_type,
532 fndecl_name (fndecl));
533 }
534
535 switch (symtab->state)
536 {
537 case PARSING:
538 cgraph_node::finalize_function (fndecl, false);
539 break;
540 case CONSTRUCTION:
541       /* Just enqueue the function to be processed at the nearest occurrence.  */
542 node = cgraph_node::get_create (fndecl);
543 if (lowered)
544 node->lowered = true;
545 cgraph_new_nodes.safe_push (node);
546 break;
547
548 case IPA:
549 case IPA_SSA:
550 case IPA_SSA_AFTER_INLINING:
551 case EXPANSION:
552       /* Bring the function into finalized state and enqueue it for later
553 	 analysis and compilation.  */
554 node = cgraph_node::get_create (fndecl);
555 node->local = false;
556 node->definition = true;
557 node->force_output = true;
558 if (TREE_PUBLIC (fndecl))
559 node->externally_visible = true;
560 if (!lowered && symtab->state == EXPANSION)
561 {
562 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
563 gimple_register_cfg_hooks ();
564 bitmap_obstack_initialize (NULL);
565 execute_pass_list (cfun, passes->all_lowering_passes);
566 passes->execute_early_local_passes ();
567 bitmap_obstack_release (NULL);
568 pop_cfun ();
569
570 lowered = true;
571 }
572 if (lowered)
573 node->lowered = true;
574 cgraph_new_nodes.safe_push (node);
575 break;
576
577 case FINISHED:
578 /* At the very end of compilation we have to do all the work up
579 to expansion. */
580 node = cgraph_node::create (fndecl);
581 if (lowered)
582 node->lowered = true;
583 node->definition = true;
584 node->analyze ();
585 push_cfun (DECL_STRUCT_FUNCTION (fndecl));
586 gimple_register_cfg_hooks ();
587 bitmap_obstack_initialize (NULL);
588 if (!gimple_in_ssa_p (DECL_STRUCT_FUNCTION (fndecl)))
589 g->get_passes ()->execute_early_local_passes ();
590 bitmap_obstack_release (NULL);
591 pop_cfun ();
592 node->expand ();
593 break;
594
595 default:
596 gcc_unreachable ();
597 }
598
599   /* Set a personality if required and we have already passed EH lowering.  */
600 if (lowered
601 && (function_needs_eh_personality (DECL_STRUCT_FUNCTION (fndecl))
602 == eh_personality_lang))
603 DECL_FUNCTION_PERSONALITY (fndecl) = lang_hooks.eh_personality ();
604 }
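/* For illustration only: a middle-end pass that has just built a new
   FUNCTION_DECL with a lowered GIMPLE body (NEW_FNDECL below is a
   hypothetical placeholder) registers it simply as

     cgraph_node::add_new_function (new_fndecl, true);

   with LOWERED set to true, and leaves the queue processing above to pick
   the function up later.  */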
605
606 /* Analyze the function scheduled to be output. */
607 void
608 cgraph_node::analyze (void)
609 {
610 if (native_rtl_p ())
611 {
612 analyzed = true;
613 return;
614 }
615
616 tree decl = this->decl;
617 location_t saved_loc = input_location;
618 input_location = DECL_SOURCE_LOCATION (decl);
619
620 if (thunk.thunk_p)
621 {
622 cgraph_node *t = cgraph_node::get (thunk.alias);
623
624 create_edge (t, NULL, t->count);
625 callees->can_throw_external = !TREE_NOTHROW (t->decl);
626 /* Target code in expand_thunk may need the thunk's target
627 to be analyzed, so recurse here. */
628 if (!t->analyzed && t->definition)
629 t->analyze ();
630 if (t->alias)
631 {
632 t = t->get_alias_target ();
633 if (!t->analyzed && t->definition)
634 t->analyze ();
635 }
636 bool ret = expand_thunk (false, false);
637 thunk.alias = NULL;
638 if (!ret)
639 return;
640 }
641 if (alias)
642 resolve_alias (cgraph_node::get (alias_target), transparent_alias);
643 else if (dispatcher_function)
644 {
645 /* Generate the dispatcher body of multi-versioned functions. */
646 cgraph_function_version_info *dispatcher_version_info
647 = function_version ();
648 if (dispatcher_version_info != NULL
649 && (dispatcher_version_info->dispatcher_resolver
650 == NULL_TREE))
651 {
652 tree resolver = NULL_TREE;
653 gcc_assert (targetm.generate_version_dispatcher_body);
654 resolver = targetm.generate_version_dispatcher_body (this);
655 gcc_assert (resolver != NULL_TREE);
656 }
657 }
658 else
659 {
660 push_cfun (DECL_STRUCT_FUNCTION (decl));
661
662 assign_assembler_name_if_needed (decl);
663
664       /* Make sure to gimplify bodies only once.  While analyzing a
665 	 function we lower it, which will require gimplified nested
666 	 functions, so we can end up here with an already gimplified
667 	 body.  */
668 if (!gimple_has_body_p (decl))
669 gimplify_function_tree (decl);
670
671 /* Lower the function. */
672 if (!lowered)
673 {
674 if (nested)
675 lower_nested_functions (decl);
676 gcc_assert (!nested);
677
678 gimple_register_cfg_hooks ();
679 bitmap_obstack_initialize (NULL);
680 execute_pass_list (cfun, g->get_passes ()->all_lowering_passes);
681 free_dominance_info (CDI_POST_DOMINATORS);
682 free_dominance_info (CDI_DOMINATORS);
683 compact_blocks ();
684 bitmap_obstack_release (NULL);
685 lowered = true;
686 }
687
688 pop_cfun ();
689 }
690 analyzed = true;
691
692 input_location = saved_loc;
693 }
694
695 /* The C++ frontend produces same body aliases all over the place, even before
696    PCH gets streamed out.  It relies on us linking the aliases with their
697    function in order to do the fixups, but ipa-ref is not PCH safe.
698    Consequently we first produce aliases without links, but once the C++ FE is
699    sure it won't stream PCH we build the links via this function.  */
700
701 void
702 symbol_table::process_same_body_aliases (void)
703 {
704 symtab_node *node;
705 FOR_EACH_SYMBOL (node)
706 if (node->cpp_implicit_alias && !node->analyzed)
707 node->resolve_alias
708 (VAR_P (node->alias_target)
709 ? (symtab_node *)varpool_node::get_create (node->alias_target)
710 : (symtab_node *)cgraph_node::get_create (node->alias_target));
711 cpp_implicit_aliases_done = true;
712 }
713
714 /* Process a symver attribute. */
715
716 static void
717 process_symver_attribute (symtab_node *n)
718 {
719 tree value = lookup_attribute ("symver", DECL_ATTRIBUTES (n->decl));
720
721 if (!value)
722 return;
723 if (lookup_attribute ("symver", TREE_CHAIN (value)))
724 {
725 error_at (DECL_SOURCE_LOCATION (n->decl),
726 "multiple versions for one symbol");
727 return;
728 }
729 tree symver = get_identifier_with_length
730 (TREE_STRING_POINTER (TREE_VALUE (TREE_VALUE (value))),
731 TREE_STRING_LENGTH (TREE_VALUE (TREE_VALUE (value))));
732 symtab_node *def = symtab_node::get_for_asmname (symver);
733
734 if (def)
735 {
736 error_at (DECL_SOURCE_LOCATION (n->decl),
737 "duplicate definition of a symbol version");
738 inform (DECL_SOURCE_LOCATION (def->decl),
739 "same version was previously defined here");
740 return;
741 }
742 if (!n->definition)
743 {
744 error_at (DECL_SOURCE_LOCATION (n->decl),
745 "symbol needs to be defined to have a version");
746 return;
747 }
748 if (DECL_COMMON (n->decl))
749 {
750 error_at (DECL_SOURCE_LOCATION (n->decl),
751 "common symbol cannot be versioned");
752 return;
753 }
754 if (DECL_COMDAT (n->decl))
755 {
756 error_at (DECL_SOURCE_LOCATION (n->decl),
757 "comdat symbol cannot be versioned");
758 return;
759 }
760 if (n->weakref)
761 {
762 error_at (DECL_SOURCE_LOCATION (n->decl),
763 "weakref cannot be versioned");
764 return;
765 }
766 if (!TREE_PUBLIC (n->decl))
767 {
768 error_at (DECL_SOURCE_LOCATION (n->decl),
769 "versioned symbol must be public");
770 return;
771 }
772 if (DECL_VISIBILITY (n->decl) != VISIBILITY_DEFAULT)
773 {
774 error_at (DECL_SOURCE_LOCATION (n->decl),
775 "versioned symbol must have default visibility");
776 return;
777 }
778
779 /* Create new symbol table entry representing the version. */
780 tree new_decl = copy_node (n->decl);
781
782 DECL_INITIAL (new_decl) = NULL_TREE;
783 if (TREE_CODE (new_decl) == FUNCTION_DECL)
784 DECL_STRUCT_FUNCTION (new_decl) = NULL;
785 SET_DECL_ASSEMBLER_NAME (new_decl, symver);
786 TREE_PUBLIC (new_decl) = 1;
787 DECL_ATTRIBUTES (new_decl) = NULL;
788
789 symtab_node *symver_node = symtab_node::get_create (new_decl);
790 symver_node->alias = true;
791 symver_node->definition = true;
792 symver_node->symver = true;
793 symver_node->create_reference (n, IPA_REF_ALIAS, NULL);
794 symver_node->analyzed = true;
795 }
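/* For illustration, a use of the attribute that satisfies the checks above
   (a public definition with default visibility that is not common, comdat or
   a weakref) looks like:

     __attribute__ ((symver ("foo@VERS_1")))
     int foo_v1 (void) { return 1; }

   which creates the additional symbol version foo@VERS_1 bound to foo_v1.  */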
796
797 /* Process attributes common for vars and functions. */
798
799 static void
800 process_common_attributes (symtab_node *node, tree decl)
801 {
802 tree weakref = lookup_attribute ("weakref", DECL_ATTRIBUTES (decl));
803
804 if (weakref && !lookup_attribute ("alias", DECL_ATTRIBUTES (decl)))
805 {
806 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
807 "%<weakref%> attribute should be accompanied with"
808 " an %<alias%> attribute");
809 DECL_WEAK (decl) = 0;
810 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
811 DECL_ATTRIBUTES (decl));
812 }
813
814 if (lookup_attribute ("no_reorder", DECL_ATTRIBUTES (decl)))
815 node->no_reorder = 1;
816 process_symver_attribute (node);
817 }
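/* For illustration, a well-formed weakref names its target either through the
   attribute argument or a separate alias attribute, e.g.

     static int helper (void) __attribute__ ((weakref ("real_helper")));

   (real_helper is just a placeholder name here), whereas a bare weakref with
   no alias triggers the warning above and is dropped.  */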
818
819 /* Look for externally_visible and used attributes and mark cgraph nodes
820 accordingly.
821
822 We cannot mark the nodes at the point the attributes are processed (in
823 handle_*_attribute) because the copy of the declarations available at that
824 point may not be canonical. For example, in:
825
826 void f();
827 void f() __attribute__((used));
828
829 the declaration we see in handle_used_attribute will be the second
830 declaration -- but the front end will subsequently merge that declaration
831 with the original declaration and discard the second declaration.
832
833 Furthermore, we can't mark these nodes in finalize_function because:
834
835 void f() {}
836 void f() __attribute__((externally_visible));
837
838 is valid.
839
840 So, we walk the nodes at the end of the translation unit, applying the
841 attributes at that point. */
842
843 static void
844 process_function_and_variable_attributes (cgraph_node *first,
845 varpool_node *first_var)
846 {
847 cgraph_node *node;
848 varpool_node *vnode;
849
850 for (node = symtab->first_function (); node != first;
851 node = symtab->next_function (node))
852 {
853 tree decl = node->decl;
854
855 if (node->alias
856 && lookup_attribute ("flatten", DECL_ATTRIBUTES (decl)))
857 {
858 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
859 "%<flatten%>"
860 		       " attribute is ignored on aliases");
861 }
862 if (DECL_PRESERVE_P (decl))
863 node->mark_force_output ();
864 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
865 {
866 if (! TREE_PUBLIC (node->decl))
867 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
868 "%<externally_visible%>"
869 		       " attribute has effect only on public objects");
870 }
871 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
872 && node->definition
873 && (!node->alias || DECL_INITIAL (decl) != error_mark_node))
874 {
875 /* NODE->DEFINITION && NODE->ALIAS is nonzero for valid weakref
876 function declarations; DECL_INITIAL is non-null for invalid
877 weakref functions that are also defined. */
878 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
879 "%<weakref%> attribute ignored"
880 " because function is defined");
881 DECL_WEAK (decl) = 0;
882 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
883 DECL_ATTRIBUTES (decl));
884 DECL_ATTRIBUTES (decl) = remove_attribute ("alias",
885 DECL_ATTRIBUTES (decl));
886 node->alias = false;
887 node->weakref = false;
888 node->transparent_alias = false;
889 }
890 else if (lookup_attribute ("alias", DECL_ATTRIBUTES (decl))
891 && node->definition
892 && !node->alias)
893 warning_at (DECL_SOURCE_LOCATION (node->decl), OPT_Wattributes,
894 "%<alias%> attribute ignored"
895 " because function is defined");
896
897 if (lookup_attribute ("always_inline", DECL_ATTRIBUTES (decl))
898 && !DECL_DECLARED_INLINE_P (decl)
899 	    /* Redefining an extern inline function makes it DECL_UNINLINABLE.  */
900 && !DECL_UNINLINABLE (decl))
901 warning_at (DECL_SOURCE_LOCATION (decl), OPT_Wattributes,
902 "%<always_inline%> function might not be inlinable");
903
904 process_common_attributes (node, decl);
905 }
906 for (vnode = symtab->first_variable (); vnode != first_var;
907 vnode = symtab->next_variable (vnode))
908 {
909 tree decl = vnode->decl;
910 if (DECL_EXTERNAL (decl)
911 && DECL_INITIAL (decl))
912 varpool_node::finalize_decl (decl);
913 if (DECL_PRESERVE_P (decl))
914 vnode->force_output = true;
915 else if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
916 {
917 if (! TREE_PUBLIC (vnode->decl))
918 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
919 "%<externally_visible%>"
920 		       " attribute has effect only on public objects");
921 }
922 if (lookup_attribute ("weakref", DECL_ATTRIBUTES (decl))
923 && vnode->definition
924 && DECL_INITIAL (decl))
925 {
926 warning_at (DECL_SOURCE_LOCATION (vnode->decl), OPT_Wattributes,
927 "%<weakref%> attribute ignored"
928 " because variable is initialized");
929 DECL_WEAK (decl) = 0;
930 DECL_ATTRIBUTES (decl) = remove_attribute ("weakref",
931 DECL_ATTRIBUTES (decl));
932 }
933 process_common_attributes (vnode, decl);
934 }
935 }
936
937 /* Mark DECL as finalized.  By finalizing the declaration, the frontend
938    instructs the middle end to output the variable to the asm file, if it is
939    needed or externally visible.  */
940
941 void
942 varpool_node::finalize_decl (tree decl)
943 {
944 varpool_node *node = varpool_node::get_create (decl);
945
946 gcc_assert (TREE_STATIC (decl) || DECL_EXTERNAL (decl));
947
948 if (node->definition)
949 return;
950 /* Set definition first before calling notice_global_symbol so that
951 it is available to notice_global_symbol. */
952 node->definition = true;
953 notice_global_symbol (decl);
954 if (!flag_toplevel_reorder)
955 node->no_reorder = true;
956 if (TREE_THIS_VOLATILE (decl) || DECL_PRESERVE_P (decl)
957 /* Traditionally we do not eliminate static variables when not
958 optimizing and when not doing toplevel reorder. */
959 || (node->no_reorder && !DECL_COMDAT (node->decl)
960 && !DECL_ARTIFICIAL (node->decl)))
961 node->force_output = true;
962
963 if (symtab->state == CONSTRUCTION
964 && (node->needed_p () || node->referred_to_p ()))
965 enqueue_node (node);
966 if (symtab->state >= IPA_SSA)
967 node->analyze ();
968   /* Some frontends produce various interface variables after compilation
969      has finished.  */
970 if (symtab->state == FINISHED
971 || (node->no_reorder
972 && symtab->state == EXPANSION))
973 node->assemble_decl ();
974 }
975
976 /* EDGE is a polymorphic call.  Mark all possible targets as reachable
977 and if there is only one target, perform trivial devirtualization.
978 REACHABLE_CALL_TARGETS collects target lists we already walked to
979 avoid duplicate work. */
980
981 static void
982 walk_polymorphic_call_targets (hash_set<void *> *reachable_call_targets,
983 cgraph_edge *edge)
984 {
985 unsigned int i;
986 void *cache_token;
987 bool final;
988 vec <cgraph_node *>targets
989 = possible_polymorphic_call_targets
990 (edge, &final, &cache_token);
991
992 if (!reachable_call_targets->add (cache_token))
993 {
994 if (symtab->dump_file)
995 dump_possible_polymorphic_call_targets
996 (symtab->dump_file, edge);
997
998 for (i = 0; i < targets.length (); i++)
999 {
1000 	  /* Do not bother to mark virtual methods in an anonymous namespace;
1001 	     either we will find a use of the virtual table defining it, or it
1002 	     is unused.  */
1003 if (targets[i]->definition
1004 && TREE_CODE
1005 (TREE_TYPE (targets[i]->decl))
1006 == METHOD_TYPE
1007 && !type_in_anonymous_namespace_p
1008 (TYPE_METHOD_BASETYPE (TREE_TYPE (targets[i]->decl))))
1009 enqueue_node (targets[i]);
1010 }
1011 }
1012
1013 /* Very trivial devirtualization; when the type is
1014      final or anonymous (so we know all its derivations)
1015 and there is only one possible virtual call target,
1016 make the edge direct. */
1017 if (final)
1018 {
1019 if (targets.length () <= 1 && dbg_cnt (devirt))
1020 {
1021 cgraph_node *target;
1022 if (targets.length () == 1)
1023 target = targets[0];
1024 else
1025 target = cgraph_node::create
1026 (builtin_decl_implicit (BUILT_IN_UNREACHABLE));
1027
1028 if (symtab->dump_file)
1029 {
1030 fprintf (symtab->dump_file,
1031 "Devirtualizing call: ");
1032 print_gimple_stmt (symtab->dump_file,
1033 edge->call_stmt, 0,
1034 TDF_SLIM);
1035 }
1036 if (dump_enabled_p ())
1037 {
1038 dump_printf_loc (MSG_OPTIMIZED_LOCATIONS, edge->call_stmt,
1039 "devirtualizing call in %s to %s\n",
1040 edge->caller->dump_name (),
1041 target->dump_name ());
1042 }
1043
1044 edge = cgraph_edge::make_direct (edge, target);
1045 gimple *new_call = cgraph_edge::redirect_call_stmt_to_callee (edge);
1046
1047 if (symtab->dump_file)
1048 {
1049 fprintf (symtab->dump_file, "Devirtualized as: ");
1050 print_gimple_stmt (symtab->dump_file, new_call, 0, TDF_SLIM);
1051 }
1052 }
1053 }
1054 }
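/* For illustration (C++ source, not part of this file): for

     struct S final { virtual int f (); };
     int g (S *p) { return p->f (); }

   the type is final, so S::f is the only possible target and the indirect
   call is made direct as above.  */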
1055
1056 /* Issue appropriate warnings for the global declaration SNODE->DECL.  */
1057
1058 static void
1059 check_global_declaration (symtab_node *snode)
1060 {
1061 const char *decl_file;
1062 tree decl = snode->decl;
1063
1064 /* Warn about any function declared static but not defined. We don't
1065 warn about variables, because many programs have static variables
1066 that exist only to get some text into the object file. */
1067 if (TREE_CODE (decl) == FUNCTION_DECL
1068 && DECL_INITIAL (decl) == 0
1069 && DECL_EXTERNAL (decl)
1070 && ! DECL_ARTIFICIAL (decl)
1071 && ! TREE_PUBLIC (decl))
1072 {
1073 if (TREE_NO_WARNING (decl))
1074 ;
1075 else if (snode->referred_to_p (/*include_self=*/false))
1076 pedwarn (input_location, 0, "%q+F used but never defined", decl);
1077 else
1078 warning (OPT_Wunused_function, "%q+F declared %<static%> but never "
1079 "defined", decl);
1080 /* This symbol is effectively an "extern" declaration now. */
1081 TREE_PUBLIC (decl) = 1;
1082 }
1083
1084 /* Warn about static fns or vars defined but not used. */
1085 if (((warn_unused_function && TREE_CODE (decl) == FUNCTION_DECL)
1086 || (((warn_unused_variable && ! TREE_READONLY (decl))
1087 || (warn_unused_const_variable > 0 && TREE_READONLY (decl)
1088 && (warn_unused_const_variable == 2
1089 || (main_input_filename != NULL
1090 && (decl_file = DECL_SOURCE_FILE (decl)) != NULL
1091 && filename_cmp (main_input_filename,
1092 decl_file) == 0))))
1093 && VAR_P (decl)))
1094 && ! DECL_IN_SYSTEM_HEADER (decl)
1095 && ! snode->referred_to_p (/*include_self=*/false)
1096 /* This TREE_USED check is needed in addition to referred_to_p
1097 above, because the `__unused__' attribute is not being
1098 considered for referred_to_p. */
1099 && ! TREE_USED (decl)
1100 /* The TREE_USED bit for file-scope decls is kept in the identifier,
1101 to handle multiple external decls in different scopes. */
1102 && ! (DECL_NAME (decl) && TREE_USED (DECL_NAME (decl)))
1103 && ! DECL_EXTERNAL (decl)
1104 && ! DECL_ARTIFICIAL (decl)
1105 && ! DECL_ABSTRACT_ORIGIN (decl)
1106 && ! TREE_PUBLIC (decl)
1107 /* A volatile variable might be used in some non-obvious way. */
1108 && (! VAR_P (decl) || ! TREE_THIS_VOLATILE (decl))
1109 /* Global register variables must be declared to reserve them. */
1110 && ! (VAR_P (decl) && DECL_REGISTER (decl))
1111 /* Global ctors and dtors are called by the runtime. */
1112 && (TREE_CODE (decl) != FUNCTION_DECL
1113 || (!DECL_STATIC_CONSTRUCTOR (decl)
1114 && !DECL_STATIC_DESTRUCTOR (decl)))
1115 /* Otherwise, ask the language. */
1116 && lang_hooks.decls.warn_unused_global (decl))
1117 warning_at (DECL_SOURCE_LOCATION (decl),
1118 (TREE_CODE (decl) == FUNCTION_DECL)
1119 ? OPT_Wunused_function
1120 : (TREE_READONLY (decl)
1121 ? OPT_Wunused_const_variable_
1122 : OPT_Wunused_variable),
1123 "%qD defined but not used", decl);
1124 }
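/* For illustration, the warnings above correspond to cases like

     static int f (void);                  // "used but never defined" when
                                           // referenced, otherwise "declared
                                           // 'static' but never defined"
     static int g (void) { return 0; }     // "defined but not used"

   in a translation unit that never defines f nor uses g.  */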
1125
1126 /* Discover all functions and variables that are trivially needed, and analyze
1127    them as well as all functions and variables referred to by them.  */
1128 static cgraph_node *first_analyzed;
1129 static varpool_node *first_analyzed_var;
1130
1131 /* FIRST_TIME is set to TRUE the first time we are called for a
1132    translation unit from finalize_compilation_unit() and to FALSE
1133    otherwise.  */
1134
1135 static void
1136 analyze_functions (bool first_time)
1137 {
1138 /* Keep track of already processed nodes when called multiple times for
1139 intermodule optimization. */
1140 cgraph_node *first_handled = first_analyzed;
1141 varpool_node *first_handled_var = first_analyzed_var;
1142 hash_set<void *> reachable_call_targets;
1143
1144 symtab_node *node;
1145 symtab_node *next;
1146 int i;
1147 ipa_ref *ref;
1148 bool changed = true;
1149 location_t saved_loc = input_location;
1150
1151 bitmap_obstack_initialize (NULL);
1152 symtab->state = CONSTRUCTION;
1153 input_location = UNKNOWN_LOCATION;
1154
1155   /* Ugly, but the fixup cannot happen at the time a same body alias is
1156      created; the C++ FE is confused about the COMDAT groups being right.  */
1157 if (symtab->cpp_implicit_aliases_done)
1158 FOR_EACH_SYMBOL (node)
1159 if (node->cpp_implicit_alias)
1160 node->fixup_same_cpp_alias_visibility (node->get_alias_target ());
1161 build_type_inheritance_graph ();
1162
1163   /* Analysis adds static variables that in turn add references to new
1164      functions, so we need to iterate the process until it stabilizes.  */
1165 while (changed)
1166 {
1167 changed = false;
1168 process_function_and_variable_attributes (first_analyzed,
1169 first_analyzed_var);
1170
1171 /* First identify the trivially needed symbols. */
1172 for (node = symtab->first_symbol ();
1173 node != first_analyzed
1174 && node != first_analyzed_var; node = node->next)
1175 {
1176 /* Convert COMDAT group designators to IDENTIFIER_NODEs. */
1177 node->get_comdat_group_id ();
1178 if (node->needed_p ())
1179 {
1180 enqueue_node (node);
1181 if (!changed && symtab->dump_file)
1182 fprintf (symtab->dump_file, "Trivially needed symbols:");
1183 changed = true;
1184 if (symtab->dump_file)
1185 fprintf (symtab->dump_file, " %s", node->dump_asm_name ());
1186 if (!changed && symtab->dump_file)
1187 fprintf (symtab->dump_file, "\n");
1188 }
1189 if (node == first_analyzed
1190 || node == first_analyzed_var)
1191 break;
1192 }
1193 symtab->process_new_functions ();
1194 first_analyzed_var = symtab->first_variable ();
1195 first_analyzed = symtab->first_function ();
1196
1197 if (changed && symtab->dump_file)
1198 fprintf (symtab->dump_file, "\n");
1199
1200       /* Lower the representation, build callgraph edges and references for all
1201 	 trivially needed symbols and all symbols referred to by them.  */
1202 while (queued_nodes != &symtab_terminator)
1203 {
1204 changed = true;
1205 node = queued_nodes;
1206 queued_nodes = (symtab_node *)queued_nodes->aux;
1207 cgraph_node *cnode = dyn_cast <cgraph_node *> (node);
1208 if (cnode && cnode->definition)
1209 {
1210 cgraph_edge *edge;
1211 tree decl = cnode->decl;
1212
1213 	      /* ??? It is possible to create an extern inline function and
1214 		 later use the weak alias attribute to kill its body.
1215 		 See gcc.c-torture/compile/20011119-1.c  */
1216 if (!DECL_STRUCT_FUNCTION (decl)
1217 && !cnode->alias
1218 && !cnode->thunk.thunk_p
1219 && !cnode->dispatcher_function)
1220 {
1221 cnode->reset ();
1222 cnode->redefined_extern_inline = true;
1223 continue;
1224 }
1225
1226 if (!cnode->analyzed)
1227 cnode->analyze ();
1228
1229 for (edge = cnode->callees; edge; edge = edge->next_callee)
1230 if (edge->callee->definition
1231 && (!DECL_EXTERNAL (edge->callee->decl)
1232 /* When not optimizing, do not try to analyze extern
1233 inline functions. Doing so is pointless. */
1234 || opt_for_fn (edge->callee->decl, optimize)
1235 		/* Weakrefs need to be preserved.  */
1236 || edge->callee->alias
1237 /* always_inline functions are inlined even at -O0. */
1238 || lookup_attribute
1239 ("always_inline",
1240 DECL_ATTRIBUTES (edge->callee->decl))
1241 		/* Multiversioned functions need the dispatcher to
1242 		   be produced locally even for extern functions.  */
1243 || edge->callee->function_version ()))
1244 enqueue_node (edge->callee);
1245 if (opt_for_fn (cnode->decl, optimize)
1246 && opt_for_fn (cnode->decl, flag_devirtualize))
1247 {
1248 cgraph_edge *next;
1249
1250 for (edge = cnode->indirect_calls; edge; edge = next)
1251 {
1252 next = edge->next_callee;
1253 if (edge->indirect_info->polymorphic)
1254 walk_polymorphic_call_targets (&reachable_call_targets,
1255 edge);
1256 }
1257 }
1258
1259 /* If decl is a clone of an abstract function,
1260 mark that abstract function so that we don't release its body.
1261 The DECL_INITIAL() of that abstract function declaration
1262 		 will later be needed to output debug info.  */
1263 if (DECL_ABSTRACT_ORIGIN (decl))
1264 {
1265 cgraph_node *origin_node
1266 = cgraph_node::get_create (DECL_ABSTRACT_ORIGIN (decl));
1267 origin_node->used_as_abstract_origin = true;
1268 }
1269 	      /* Preserve a function's function-context node.  It will
1270 		 later be needed to output debug info.  */
1271 if (tree fn = decl_function_context (decl))
1272 {
1273 cgraph_node *origin_node = cgraph_node::get_create (fn);
1274 enqueue_node (origin_node);
1275 }
1276 }
1277 else
1278 {
1279 varpool_node *vnode = dyn_cast <varpool_node *> (node);
1280 if (vnode && vnode->definition && !vnode->analyzed)
1281 vnode->analyze ();
1282 }
1283
1284 if (node->same_comdat_group)
1285 {
1286 symtab_node *next;
1287 for (next = node->same_comdat_group;
1288 next != node;
1289 next = next->same_comdat_group)
1290 if (!next->comdat_local_p ())
1291 enqueue_node (next);
1292 }
1293 for (i = 0; node->iterate_reference (i, ref); i++)
1294 if (ref->referred->definition
1295 && (!DECL_EXTERNAL (ref->referred->decl)
1296 || ((TREE_CODE (ref->referred->decl) != FUNCTION_DECL
1297 && optimize)
1298 || (TREE_CODE (ref->referred->decl) == FUNCTION_DECL
1299 && opt_for_fn (ref->referred->decl, optimize))
1300 || node->alias
1301 || ref->referred->alias)))
1302 enqueue_node (ref->referred);
1303 symtab->process_new_functions ();
1304 }
1305 }
1306 update_type_inheritance_graph ();
1307
1308 /* Collect entry points to the unit. */
1309 if (symtab->dump_file)
1310 {
1311 fprintf (symtab->dump_file, "\n\nInitial ");
1312 symtab->dump (symtab->dump_file);
1313 }
1314
1315 if (first_time)
1316 {
1317 symtab_node *snode;
1318 FOR_EACH_SYMBOL (snode)
1319 check_global_declaration (snode);
1320 }
1321
1322 if (symtab->dump_file)
1323 fprintf (symtab->dump_file, "\nRemoving unused symbols:");
1324
1325 for (node = symtab->first_symbol ();
1326 node != first_handled
1327 && node != first_handled_var; node = next)
1328 {
1329 next = node->next;
1330 /* For symbols declared locally we clear TREE_READONLY when emitting
1331 	 the constructor (if one is needed).  For external declarations we
1332 	 cannot safely assume that the type is readonly because we may be called
1333 during its construction. */
1334 if (TREE_CODE (node->decl) == VAR_DECL
1335 && TYPE_P (TREE_TYPE (node->decl))
1336 && TYPE_NEEDS_CONSTRUCTING (TREE_TYPE (node->decl))
1337 && DECL_EXTERNAL (node->decl))
1338 TREE_READONLY (node->decl) = 0;
1339 if (!node->aux && !node->referred_to_p ())
1340 {
1341 if (symtab->dump_file)
1342 fprintf (symtab->dump_file, " %s", node->dump_name ());
1343
1344 /* See if the debugger can use anything before the DECL
1345 passes away. Perhaps it can notice a DECL that is now a
1346 constant and can tag the early DIE with an appropriate
1347 attribute.
1348
1349 Otherwise, this is the last chance the debug_hooks have
1350 at looking at optimized away DECLs, since
1351 late_global_decl will subsequently be called from the
1352 contents of the now pruned symbol table. */
1353 if (VAR_P (node->decl)
1354 && !decl_function_context (node->decl))
1355 {
1356 /* We are reclaiming totally unreachable code and variables
1357 so they effectively appear as readonly. Show that to
1358 the debug machinery. */
1359 TREE_READONLY (node->decl) = 1;
1360 node->definition = false;
1361 (*debug_hooks->late_global_decl) (node->decl);
1362 }
1363
1364 node->remove ();
1365 continue;
1366 }
1367 if (cgraph_node *cnode = dyn_cast <cgraph_node *> (node))
1368 {
1369 tree decl = node->decl;
1370
1371 if (cnode->definition && !gimple_has_body_p (decl)
1372 && !cnode->alias
1373 && !cnode->thunk.thunk_p)
1374 cnode->reset ();
1375
1376 gcc_assert (!cnode->definition || cnode->thunk.thunk_p
1377 || cnode->alias
1378 || gimple_has_body_p (decl)
1379 || cnode->native_rtl_p ());
1380 gcc_assert (cnode->analyzed == cnode->definition);
1381 }
1382 node->aux = NULL;
1383 }
1384 for (;node; node = node->next)
1385 node->aux = NULL;
1386 first_analyzed = symtab->first_function ();
1387 first_analyzed_var = symtab->first_variable ();
1388 if (symtab->dump_file)
1389 {
1390 fprintf (symtab->dump_file, "\n\nReclaimed ");
1391 symtab->dump (symtab->dump_file);
1392 }
1393 bitmap_obstack_release (NULL);
1394 ggc_collect ();
1395   /* Initialize the assembler name hash; in particular we want to trigger C++
1396      mangling and same body alias creation before we free DECL_ARGUMENTS
1397      used by it.  */
1398 if (!seen_error ())
1399 symtab->symtab_initialize_asm_name_hash ();
1400
1401 input_location = saved_loc;
1402 }
1403
1404 /* Check declaration of the type of ALIAS for compatibility with its TARGET
1405 (which may be an ifunc resolver) and issue a diagnostic when they are
1406 not compatible according to language rules (plus a C++ extension for
1407 non-static member functions). */
1408
1409 static void
1410 maybe_diag_incompatible_alias (tree alias, tree target)
1411 {
1412 tree altype = TREE_TYPE (alias);
1413 tree targtype = TREE_TYPE (target);
1414
1415 bool ifunc = cgraph_node::get (alias)->ifunc_resolver;
1416 tree funcptr = altype;
1417
1418 if (ifunc)
1419 {
1420 /* Handle attribute ifunc first. */
1421 if (TREE_CODE (altype) == METHOD_TYPE)
1422 {
1423 /* Set FUNCPTR to the type of the alias target. If the type
1424 is a non-static member function of class C, construct a type
1425 of an ordinary function taking C* as the first argument,
1426 followed by the member function argument list, and use it
1427 instead to check for incompatibility. This conversion is
1428 not defined by the language but an extension provided by
1429 G++. */
1430
1431 tree rettype = TREE_TYPE (altype);
1432 tree args = TYPE_ARG_TYPES (altype);
1433 altype = build_function_type (rettype, args);
1434 funcptr = altype;
1435 }
1436
1437 targtype = TREE_TYPE (targtype);
1438
1439 if (POINTER_TYPE_P (targtype))
1440 {
1441 targtype = TREE_TYPE (targtype);
1442
1443 /* Only issue Wattribute-alias for conversions to void* with
1444 -Wextra. */
1445 if (VOID_TYPE_P (targtype) && !extra_warnings)
1446 return;
1447
1448 /* Proceed to handle incompatible ifunc resolvers below. */
1449 }
1450 else
1451 {
1452 funcptr = build_pointer_type (funcptr);
1453
1454 error_at (DECL_SOURCE_LOCATION (target),
1455 "%<ifunc%> resolver for %qD must return %qT",
1456 alias, funcptr);
1457 inform (DECL_SOURCE_LOCATION (alias),
1458 "resolver indirect function declared here");
1459 return;
1460 }
1461 }
1462
1463 if ((!FUNC_OR_METHOD_TYPE_P (targtype)
1464 || (prototype_p (altype)
1465 && prototype_p (targtype)
1466 && !types_compatible_p (altype, targtype))))
1467 {
1468 /* Warn for incompatibilities. Avoid warning for functions
1469 without a prototype to make it possible to declare aliases
1470 without knowing the exact type, as libstdc++ does. */
1471 if (ifunc)
1472 {
1473 funcptr = build_pointer_type (funcptr);
1474
1475 auto_diagnostic_group d;
1476 if (warning_at (DECL_SOURCE_LOCATION (target),
1477 OPT_Wattribute_alias_,
1478 "%<ifunc%> resolver for %qD should return %qT",
1479 alias, funcptr))
1480 inform (DECL_SOURCE_LOCATION (alias),
1481 "resolver indirect function declared here");
1482 }
1483 else
1484 {
1485 auto_diagnostic_group d;
1486 if (warning_at (DECL_SOURCE_LOCATION (alias),
1487 OPT_Wattribute_alias_,
1488 "%qD alias between functions of incompatible "
1489 "types %qT and %qT", alias, altype, targtype))
1490 inform (DECL_SOURCE_LOCATION (target),
1491 "aliased declaration here");
1492 }
1493 }
1494 }
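/* For illustration, the -Wattribute-alias warning above triggers for
   prototyped declarations such as

     int f (int i) { return i; }
     long g (long) __attribute__ ((alias ("f")));

   where the alias and its target have incompatible types.  */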
1495
1496 /* Translate the ugly representation of aliases as alias pairs into a nice
1497    representation in the callgraph.  We don't handle all cases yet,
1498    unfortunately.  */
1499
1500 static void
1501 handle_alias_pairs (void)
1502 {
1503 alias_pair *p;
1504 unsigned i;
1505
1506 for (i = 0; alias_pairs && alias_pairs->iterate (i, &p);)
1507 {
1508 symtab_node *target_node = symtab_node::get_for_asmname (p->target);
1509
1510       /* Weakrefs with a target not defined in the current unit are easy to
1511 	 handle: they behave just as external variables except we need to note
1512 	 the alias flag to later output the weakref pseudo op into the asm file.  */
1513 if (!target_node
1514 && lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)) != NULL)
1515 {
1516 symtab_node *node = symtab_node::get (p->decl);
1517 if (node)
1518 {
1519 node->alias_target = p->target;
1520 node->weakref = true;
1521 node->alias = true;
1522 node->transparent_alias = true;
1523 }
1524 alias_pairs->unordered_remove (i);
1525 continue;
1526 }
1527 else if (!target_node)
1528 {
1529 error ("%q+D aliased to undefined symbol %qE", p->decl, p->target);
1530 symtab_node *node = symtab_node::get (p->decl);
1531 if (node)
1532 node->alias = false;
1533 alias_pairs->unordered_remove (i);
1534 continue;
1535 }
1536
1537 if (DECL_EXTERNAL (target_node->decl)
1538 /* We use local aliases for C++ thunks to force the tailcall
1539 to bind locally. This is a hack - to keep it working do
1540 the following (which is not strictly correct). */
1541 && (TREE_CODE (target_node->decl) != FUNCTION_DECL
1542 || ! DECL_VIRTUAL_P (target_node->decl))
1543 && ! lookup_attribute ("weakref", DECL_ATTRIBUTES (p->decl)))
1544 {
1545 error ("%q+D aliased to external symbol %qE",
1546 p->decl, p->target);
1547 }
1548
1549 if (TREE_CODE (p->decl) == FUNCTION_DECL
1550 && target_node && is_a <cgraph_node *> (target_node))
1551 {
1552 maybe_diag_incompatible_alias (p->decl, target_node->decl);
1553
1554 maybe_diag_alias_attributes (p->decl, target_node->decl);
1555
1556 cgraph_node *src_node = cgraph_node::get (p->decl);
1557 if (src_node && src_node->definition)
1558 src_node->reset ();
1559 cgraph_node::create_alias (p->decl, target_node->decl);
1560 alias_pairs->unordered_remove (i);
1561 }
1562 else if (VAR_P (p->decl)
1563 && target_node && is_a <varpool_node *> (target_node))
1564 {
1565 varpool_node::create_alias (p->decl, target_node->decl);
1566 alias_pairs->unordered_remove (i);
1567 }
1568 else
1569 {
1570 error ("%q+D alias between function and variable is not supported",
1571 p->decl);
1572 inform (DECL_SOURCE_LOCATION (target_node->decl),
1573 "aliased declaration here");
1574
1575 alias_pairs->unordered_remove (i);
1576 }
1577 }
1578 vec_free (alias_pairs);
1579 }
1580
1581
1582 /* Figure out what functions we want to assemble. */
1583
1584 static void
1585 mark_functions_to_output (void)
1586 {
1587 bool check_same_comdat_groups = false;
1588 cgraph_node *node;
1589
1590 if (flag_checking)
1591 FOR_EACH_FUNCTION (node)
1592 gcc_assert (!node->process);
1593
1594 FOR_EACH_FUNCTION (node)
1595 {
1596 tree decl = node->decl;
1597
1598 gcc_assert (!node->process || node->same_comdat_group);
1599 if (node->process)
1600 continue;
1601
1602 /* We need to output all local functions that are used and not
1603 always inlined, as well as those that are reachable from
1604 outside the current compilation unit. */
1605 if (node->analyzed
1606 && !node->thunk.thunk_p
1607 && !node->alias
1608 && !node->inlined_to
1609 && !TREE_ASM_WRITTEN (decl)
1610 && !DECL_EXTERNAL (decl))
1611 {
1612 node->process = 1;
1613 if (node->same_comdat_group)
1614 {
1615 cgraph_node *next;
1616 for (next = dyn_cast<cgraph_node *> (node->same_comdat_group);
1617 next != node;
1618 next = dyn_cast<cgraph_node *> (next->same_comdat_group))
1619 if (!next->thunk.thunk_p && !next->alias
1620 && !next->comdat_local_p ())
1621 next->process = 1;
1622 }
1623 }
1624 else if (node->same_comdat_group)
1625 {
1626 if (flag_checking)
1627 check_same_comdat_groups = true;
1628 }
1629 else
1630 {
1631 /* We should've reclaimed all functions that are not needed. */
1632 if (flag_checking
1633 && !node->inlined_to
1634 && gimple_has_body_p (decl)
1635 	     /* FIXME: in an ltrans unit when the offline copy is outside a partition
1636 		but inline copies are inside a partition, we can end up not removing
1637 		the body since we no longer have an analyzed node pointing to it.  */
1638 && !node->in_other_partition
1639 && !node->alias
1640 && !node->clones
1641 && !DECL_EXTERNAL (decl))
1642 {
1643 node->debug ();
1644 internal_error ("failed to reclaim unneeded function");
1645 }
1646 gcc_assert (node->inlined_to
1647 || !gimple_has_body_p (decl)
1648 || node->in_other_partition
1649 || node->clones
1650 || DECL_ARTIFICIAL (decl)
1651 || DECL_EXTERNAL (decl));
1652
1653 }
1654
1655 }
1656 if (flag_checking && check_same_comdat_groups)
1657 FOR_EACH_FUNCTION (node)
1658 if (node->same_comdat_group && !node->process)
1659 {
1660 tree decl = node->decl;
1661 if (!node->inlined_to
1662 && gimple_has_body_p (decl)
1663 /* FIXME: in an ltrans unit when the offline copy is outside a
1664 partition but inline copies are inside a partition, we can
1665 end up not removing the body since we no longer have an
1666 analyzed node pointing to it. */
1667 && !node->in_other_partition
1668 && !node->clones
1669 && !DECL_EXTERNAL (decl))
1670 {
1671 node->debug ();
1672 internal_error ("failed to reclaim unneeded function in same "
1673 "comdat group");
1674 }
1675 }
1676 }
1677
1678 /* DECL is a FUNCTION_DECL.  Initialize datastructures so that DECL is a
1679    function in lowered GIMPLE form.  IN_SSA is true if the GIMPLE is in SSA.
1680
1681    Set current_function_decl and cfun to the newly constructed empty
1682    function body.  Return the basic block in the function body.  */
1683
1684 basic_block
1685 init_lowered_empty_function (tree decl, bool in_ssa, profile_count count)
1686 {
1687 basic_block bb;
1688 edge e;
1689
1690 current_function_decl = decl;
1691 allocate_struct_function (decl, false);
1692 gimple_register_cfg_hooks ();
1693 init_empty_tree_cfg ();
1694 init_tree_ssa (cfun);
1695
1696 if (in_ssa)
1697 {
1698 init_ssa_operands (cfun);
1699 cfun->gimple_df->in_ssa_p = true;
1700 cfun->curr_properties |= PROP_ssa;
1701 }
1702
1703 DECL_INITIAL (decl) = make_node (BLOCK);
1704 BLOCK_SUPERCONTEXT (DECL_INITIAL (decl)) = decl;
1705
1706 DECL_SAVED_TREE (decl) = error_mark_node;
1707 cfun->curr_properties |= (PROP_gimple_lcf | PROP_gimple_leh | PROP_gimple_any
1708 | PROP_cfg | PROP_loops);
1709
1710 set_loops_for_fn (cfun, ggc_cleared_alloc<loops> ());
1711 init_loops_structure (cfun, loops_for_fn (cfun), 1);
1712 loops_for_fn (cfun)->state |= LOOPS_MAY_HAVE_MULTIPLE_LATCHES;
1713
1714 /* Create BB for body of the function and connect it properly. */
1715 ENTRY_BLOCK_PTR_FOR_FN (cfun)->count = count;
1716 EXIT_BLOCK_PTR_FOR_FN (cfun)->count = count;
1717 bb = create_basic_block (NULL, ENTRY_BLOCK_PTR_FOR_FN (cfun));
1718 bb->count = count;
1719 e = make_edge (ENTRY_BLOCK_PTR_FOR_FN (cfun), bb, EDGE_FALLTHRU);
1720 e->probability = profile_probability::always ();
1721 e = make_edge (bb, EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
1722 e->probability = profile_probability::always ();
1723 add_bb_to_loop (bb, ENTRY_BLOCK_PTR_FOR_FN (cfun)->loop_father);
1724
1725 return bb;
1726 }
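/* For example, expand_thunk uses this helper to create the initially empty
   SSA body of a thunk before emitting the adjusting code into the returned
   block, roughly:

     bb = init_lowered_empty_function (thunk_fndecl, true, count);

   (a paraphrase of that caller, shown here only for illustration).  */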
1727
1728 /* Adjust PTR by the constant FIXED_OFFSET, by the vtable offset indicated by
1729 VIRTUAL_OFFSET, if it is non-null, and by the indirect offset indicated by
1730 INDIRECT_OFFSET, if it is nonzero.  THIS_ADJUSTING is nonzero for a
1731 'this'-adjusting thunk and zero for a result-adjusting thunk. */
1732
1733 tree
1734 thunk_adjust (gimple_stmt_iterator * bsi,
1735 tree ptr, bool this_adjusting,
1736 HOST_WIDE_INT fixed_offset, tree virtual_offset,
1737 HOST_WIDE_INT indirect_offset)
1738 {
1739 gassign *stmt;
1740 tree ret;
1741
1742 if (this_adjusting
1743 && fixed_offset != 0)
1744 {
1745 stmt = gimple_build_assign
1746 (ptr, fold_build_pointer_plus_hwi_loc (input_location,
1747 ptr,
1748 fixed_offset));
1749 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1750 }
1751
1752 if (!vtable_entry_type && (virtual_offset || indirect_offset != 0))
1753 {
1754 tree vfunc_type = make_node (FUNCTION_TYPE);
1755 TREE_TYPE (vfunc_type) = integer_type_node;
1756 TYPE_ARG_TYPES (vfunc_type) = NULL_TREE;
1757 layout_type (vfunc_type);
1758
1759 vtable_entry_type = build_pointer_type (vfunc_type);
1760 }
1761
1762 /* If there's a virtual offset, look up that value in the vtable and
1763 adjust the pointer again. */
1764 if (virtual_offset)
1765 {
1766 tree vtabletmp;
1767 tree vtabletmp2;
1768 tree vtabletmp3;
1769
1770 vtabletmp =
1771 create_tmp_reg (build_pointer_type
1772 (build_pointer_type (vtable_entry_type)), "vptr");
1773
1774 /* The vptr is always at offset zero in the object. */
1775 stmt = gimple_build_assign (vtabletmp,
1776 build1 (NOP_EXPR, TREE_TYPE (vtabletmp),
1777 ptr));
1778 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1779
1780 /* Form the vtable address. */
1781 vtabletmp2 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp)),
1782 "vtableaddr");
1783 stmt = gimple_build_assign (vtabletmp2,
1784 build_simple_mem_ref (vtabletmp));
1785 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1786
1787 /* Find the entry with the vcall offset. */
1788 stmt = gimple_build_assign (vtabletmp2,
1789 fold_build_pointer_plus_loc (input_location,
1790 vtabletmp2,
1791 virtual_offset));
1792 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1793
1794 /* Get the offset itself. */
1795 vtabletmp3 = create_tmp_reg (TREE_TYPE (TREE_TYPE (vtabletmp2)),
1796 "vcalloffset");
1797 stmt = gimple_build_assign (vtabletmp3,
1798 build_simple_mem_ref (vtabletmp2));
1799 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1800
1801 /* Adjust the `this' pointer. */
1802 ptr = fold_build_pointer_plus_loc (input_location, ptr, vtabletmp3);
1803 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1804 GSI_CONTINUE_LINKING);
1805 }
1806
1807 /* Likewise for an offset that is stored in the object that contains the
1808 vtable. */
1809 if (indirect_offset != 0)
1810 {
1811 tree offset_ptr, offset_tree;
1812
1813 /* Get the address of the offset. */
1814 offset_ptr
1815 = create_tmp_reg (build_pointer_type
1816 (build_pointer_type (vtable_entry_type)),
1817 "offset_ptr");
1818 stmt = gimple_build_assign (offset_ptr,
1819 build1 (NOP_EXPR, TREE_TYPE (offset_ptr),
1820 ptr));
1821 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1822
1823 stmt = gimple_build_assign
1824 (offset_ptr,
1825 fold_build_pointer_plus_hwi_loc (input_location, offset_ptr,
1826 indirect_offset));
1827 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1828
1829 /* Get the offset itself. */
1830 offset_tree = create_tmp_reg (TREE_TYPE (TREE_TYPE (offset_ptr)),
1831 "offset");
1832 stmt = gimple_build_assign (offset_tree,
1833 build_simple_mem_ref (offset_ptr));
1834 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1835
1836 /* Adjust the `this' pointer. */
1837 ptr = fold_build_pointer_plus_loc (input_location, ptr, offset_tree);
1838 ptr = force_gimple_operand_gsi (bsi, ptr, true, NULL_TREE, false,
1839 GSI_CONTINUE_LINKING);
1840 }
1841
1842 if (!this_adjusting
1843 && fixed_offset != 0)
1844 /* Adjust the pointer by the constant. */
1845 {
1846 tree ptrtmp;
1847
1848 if (VAR_P (ptr))
1849 ptrtmp = ptr;
1850 else
1851 {
1852 ptrtmp = create_tmp_reg (TREE_TYPE (ptr), "ptr");
1853 stmt = gimple_build_assign (ptrtmp, ptr);
1854 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1855 }
1856 ptr = fold_build_pointer_plus_hwi_loc (input_location,
1857 ptrtmp, fixed_offset);
1858 }
1859
1860 /* Emit the statement and gimplify the adjustment expression. */
1861 ret = create_tmp_reg (TREE_TYPE (ptr), "adjusted_this");
1862 stmt = gimple_build_assign (ret, ptr);
1863 gsi_insert_after (bsi, stmt, GSI_NEW_STMT);
1864
1865 return ret;
1866 }
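/* For illustration only, C++ input (hypothetical class names) for which the
   adjustments above are generated.  With non-virtual multiple inheritance the
   thunk for C::g in the B-in-C vtable needs only a constant FIXED_OFFSET;
   with a virtual base the adjustment must be loaded from the vtable, which is
   the VIRTUAL_OFFSET path above.

     struct A { virtual void f (); };
     struct B { virtual void g (); };
     struct C : A, B { void g (); };		// constant this adjustment

     struct VB { virtual void h (); };
     struct D : virtual VB { void h (); };	// vcall offset from the vtable  */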
1867
1868 /* Expand the thunk into gimple if possible.
1869 When FORCE_GIMPLE_THUNK is true, a gimple thunk is created and
1870 no assembler is produced.
1871 When OUTPUT_ASM_THUNKS is true, also produce assembler for
1872 thunks that are not lowered. */
1873
1874 bool
1875 cgraph_node::expand_thunk (bool output_asm_thunks, bool force_gimple_thunk)
1876 {
1877 bool this_adjusting = thunk.this_adjusting;
1878 HOST_WIDE_INT fixed_offset = thunk.fixed_offset;
1879 HOST_WIDE_INT virtual_value = thunk.virtual_value;
1880 HOST_WIDE_INT indirect_offset = thunk.indirect_offset;
1881 tree virtual_offset = NULL;
1882 tree alias = callees->callee->decl;
1883 tree thunk_fndecl = decl;
1884 tree a;
1885
1886 if (!force_gimple_thunk
1887 && this_adjusting
1888 && indirect_offset == 0
1889 && !DECL_EXTERNAL (alias)
1890 && !DECL_STATIC_CHAIN (alias)
1891 && targetm.asm_out.can_output_mi_thunk (thunk_fndecl, fixed_offset,
1892 virtual_value, alias))
1893 {
1894 tree fn_block;
1895 tree restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1896
1897 if (!output_asm_thunks)
1898 {
1899 analyzed = true;
1900 return false;
1901 }
1902
1903 if (in_lto_p)
1904 get_untransformed_body ();
1905 a = DECL_ARGUMENTS (thunk_fndecl);
1906
1907 current_function_decl = thunk_fndecl;
1908
1909 /* Ensure thunks are emitted in their correct sections. */
1910 resolve_unique_section (thunk_fndecl, 0,
1911 flag_function_sections);
1912
1913 DECL_RESULT (thunk_fndecl)
1914 = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
1915 RESULT_DECL, 0, restype);
1916 DECL_CONTEXT (DECL_RESULT (thunk_fndecl)) = thunk_fndecl;
1917
1918 /* The back end expects DECL_INITIAL to contain a BLOCK, so we
1919 create one. */
1920 fn_block = make_node (BLOCK);
1921 BLOCK_VARS (fn_block) = a;
1922 DECL_INITIAL (thunk_fndecl) = fn_block;
1923 BLOCK_SUPERCONTEXT (fn_block) = thunk_fndecl;
1924 allocate_struct_function (thunk_fndecl, false);
1925 init_function_start (thunk_fndecl);
1926 cfun->is_thunk = 1;
1927 insn_locations_init ();
1928 set_curr_insn_location (DECL_SOURCE_LOCATION (thunk_fndecl));
1929 prologue_location = curr_insn_location ();
1930
1931 targetm.asm_out.output_mi_thunk (asm_out_file, thunk_fndecl,
1932 fixed_offset, virtual_value, alias);
1933
1934 insn_locations_finalize ();
1935 init_insn_lengths ();
1936 free_after_compilation (cfun);
1937 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1938 thunk.thunk_p = false;
1939 analyzed = false;
1940 }
1941 else if (stdarg_p (TREE_TYPE (thunk_fndecl)))
1942 {
1943 error ("generic thunk code fails for method %qD which uses %<...%>",
1944 thunk_fndecl);
1945 TREE_ASM_WRITTEN (thunk_fndecl) = 1;
1946 analyzed = true;
1947 return false;
1948 }
1949 else
1950 {
1951 tree restype;
1952 basic_block bb, then_bb, else_bb, return_bb;
1953 gimple_stmt_iterator bsi;
1954 int nargs = 0;
1955 tree arg;
1956 int i;
1957 tree resdecl;
1958 tree restmp = NULL;
1959
1960 gcall *call;
1961 greturn *ret;
1962 bool alias_is_noreturn = TREE_THIS_VOLATILE (alias);
1963
1964 /* We may be called with the body already released except for DECL_ARGUMENTS
1965 (see create_wrapper); in that case force_gimple_thunk is true. */
1966 if (in_lto_p && !force_gimple_thunk)
1967 get_untransformed_body ();
1968
1969 /* We need to force DECL_IGNORED_P when the thunk is created
1970 after early debug was run. */
1971 if (force_gimple_thunk)
1972 DECL_IGNORED_P (thunk_fndecl) = 1;
1973
1974 a = DECL_ARGUMENTS (thunk_fndecl);
1975
1976 current_function_decl = thunk_fndecl;
1977
1978 /* Ensure thunks are emitted in their correct sections. */
1979 resolve_unique_section (thunk_fndecl, 0,
1980 flag_function_sections);
1981
1982 bitmap_obstack_initialize (NULL);
1983
1984 if (thunk.virtual_offset_p)
1985 virtual_offset = size_int (virtual_value);
1986
1987 /* Build the return declaration for the function. */
1988 restype = TREE_TYPE (TREE_TYPE (thunk_fndecl));
1989 if (DECL_RESULT (thunk_fndecl) == NULL_TREE)
1990 {
1991 resdecl = build_decl (input_location, RESULT_DECL, 0, restype);
1992 DECL_ARTIFICIAL (resdecl) = 1;
1993 DECL_IGNORED_P (resdecl) = 1;
1994 DECL_CONTEXT (resdecl) = thunk_fndecl;
1995 DECL_RESULT (thunk_fndecl) = resdecl;
1996 }
1997 else
1998 resdecl = DECL_RESULT (thunk_fndecl);
1999
2000 profile_count cfg_count = count;
2001 if (!cfg_count.initialized_p ())
2002 cfg_count = profile_count::from_gcov_type (BB_FREQ_MAX).guessed_local ();
2003
2004 bb = then_bb = else_bb = return_bb
2005 = init_lowered_empty_function (thunk_fndecl, true, cfg_count);
2006
2007 bsi = gsi_start_bb (bb);
2008
2009 /* Build call to the function being thunked. */
2010 if (!VOID_TYPE_P (restype)
2011 && (!alias_is_noreturn
2012 || TREE_ADDRESSABLE (restype)
2013 || TREE_CODE (TYPE_SIZE_UNIT (restype)) != INTEGER_CST))
2014 {
2015 if (DECL_BY_REFERENCE (resdecl))
2016 {
2017 restmp = gimple_fold_indirect_ref (resdecl);
2018 if (!restmp)
2019 restmp = build2 (MEM_REF,
2020 TREE_TYPE (TREE_TYPE (resdecl)),
2021 resdecl,
2022 build_int_cst (TREE_TYPE (resdecl), 0));
2023 }
2024 else if (!is_gimple_reg_type (restype))
2025 {
2026 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl)))
2027 {
2028 restmp = resdecl;
2029
2030 if (VAR_P (restmp))
2031 {
2032 add_local_decl (cfun, restmp);
2033 BLOCK_VARS (DECL_INITIAL (current_function_decl))
2034 = restmp;
2035 }
2036 }
2037 else
2038 restmp = create_tmp_var (restype, "retval");
2039 }
2040 else
2041 restmp = create_tmp_reg (restype, "retval");
2042 }
2043
2044 for (arg = a; arg; arg = DECL_CHAIN (arg))
2045 nargs++;
2046 auto_vec<tree> vargs (nargs);
2047 i = 0;
2048 arg = a;
2049 if (this_adjusting)
2050 {
2051 vargs.quick_push (thunk_adjust (&bsi, a, 1, fixed_offset,
2052 virtual_offset, indirect_offset));
2053 arg = DECL_CHAIN (a);
2054 i = 1;
2055 }
2056
2057 if (nargs)
2058 for (; i < nargs; i++, arg = DECL_CHAIN (arg))
2059 {
2060 tree tmp = arg;
2061 if (VECTOR_TYPE_P (TREE_TYPE (arg))
2062 || TREE_CODE (TREE_TYPE (arg)) == COMPLEX_TYPE)
2063 DECL_GIMPLE_REG_P (arg) = 1;
2064
2065 if (!is_gimple_val (arg))
2066 {
2067 tmp = create_tmp_reg (TYPE_MAIN_VARIANT
2068 (TREE_TYPE (arg)), "arg");
2069 gimple *stmt = gimple_build_assign (tmp, arg);
2070 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
2071 }
2072 vargs.quick_push (tmp);
2073 }
2074 call = gimple_build_call_vec (build_fold_addr_expr_loc (0, alias), vargs);
2075 callees->call_stmt = call;
2076 gimple_call_set_from_thunk (call, true);
2077 if (DECL_STATIC_CHAIN (alias))
2078 {
2079 tree p = DECL_STRUCT_FUNCTION (alias)->static_chain_decl;
2080 tree type = TREE_TYPE (p);
2081 tree decl = build_decl (DECL_SOURCE_LOCATION (thunk_fndecl),
2082 PARM_DECL, create_tmp_var_name ("CHAIN"),
2083 type);
2084 DECL_ARTIFICIAL (decl) = 1;
2085 DECL_IGNORED_P (decl) = 1;
2086 TREE_USED (decl) = 1;
2087 DECL_CONTEXT (decl) = thunk_fndecl;
2088 DECL_ARG_TYPE (decl) = type;
2089 TREE_READONLY (decl) = 1;
2090
2091 struct function *sf = DECL_STRUCT_FUNCTION (thunk_fndecl);
2092 sf->static_chain_decl = decl;
2093
2094 gimple_call_set_chain (call, decl);
2095 }
2096
2097 /* Return slot optimization is always possible and in fact required to
2098 return values with DECL_BY_REFERENCE. */
2099 if (aggregate_value_p (resdecl, TREE_TYPE (thunk_fndecl))
2100 && (!is_gimple_reg_type (TREE_TYPE (resdecl))
2101 || DECL_BY_REFERENCE (resdecl)))
2102 gimple_call_set_return_slot_opt (call, true);
2103
2104 if (restmp)
2105 {
2106 gimple_call_set_lhs (call, restmp);
2107 gcc_assert (useless_type_conversion_p (TREE_TYPE (restmp),
2108 TREE_TYPE (TREE_TYPE (alias))));
2109 }
2110 gsi_insert_after (&bsi, call, GSI_NEW_STMT);
2111 if (!alias_is_noreturn)
2112 {
2113 if (restmp && !this_adjusting
2114 && (fixed_offset || virtual_offset))
2115 {
2116 tree true_label = NULL_TREE;
2117
2118 if (TREE_CODE (TREE_TYPE (restmp)) == POINTER_TYPE)
2119 {
2120 gimple *stmt;
2121 edge e;
2122 /* If the return type is a pointer, we need to
2123 protect against NULL. We know there will be an
2124 adjustment, because that's why we're emitting a
2125 thunk. */
2126 then_bb = create_basic_block (NULL, bb);
2127 then_bb->count = cfg_count - cfg_count.apply_scale (1, 16);
2128 return_bb = create_basic_block (NULL, then_bb);
2129 return_bb->count = cfg_count;
2130 else_bb = create_basic_block (NULL, else_bb);
2131 else_bb->count = cfg_count.apply_scale (1, 16);
2132 add_bb_to_loop (then_bb, bb->loop_father);
2133 add_bb_to_loop (return_bb, bb->loop_father);
2134 add_bb_to_loop (else_bb, bb->loop_father);
2135 remove_edge (single_succ_edge (bb));
2136 true_label = gimple_block_label (then_bb);
2137 stmt = gimple_build_cond (NE_EXPR, restmp,
2138 build_zero_cst (TREE_TYPE (restmp)),
2139 NULL_TREE, NULL_TREE);
2140 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
2141 e = make_edge (bb, then_bb, EDGE_TRUE_VALUE);
2142 e->probability = profile_probability::guessed_always ()
2143 .apply_scale (1, 16);
2144 e = make_edge (bb, else_bb, EDGE_FALSE_VALUE);
2145 e->probability = profile_probability::guessed_always ()
2146 .apply_scale (1, 16);
2147 make_single_succ_edge (return_bb,
2148 EXIT_BLOCK_PTR_FOR_FN (cfun), 0);
2149 make_single_succ_edge (then_bb, return_bb, EDGE_FALLTHRU);
2150 e = make_edge (else_bb, return_bb, EDGE_FALLTHRU);
2151 e->probability = profile_probability::always ();
2152 bsi = gsi_last_bb (then_bb);
2153 }
2154
2155 restmp = thunk_adjust (&bsi, restmp, /*this_adjusting=*/0,
2156 fixed_offset, virtual_offset,
2157 indirect_offset);
2158 if (true_label)
2159 {
2160 gimple *stmt;
2161 bsi = gsi_last_bb (else_bb);
2162 stmt = gimple_build_assign (restmp,
2163 build_zero_cst (TREE_TYPE (restmp)));
2164 gsi_insert_after (&bsi, stmt, GSI_NEW_STMT);
2165 bsi = gsi_last_bb (return_bb);
2166 }
2167 }
2168 else
2169 gimple_call_set_tail (call, true);
2170
2171 /* Build return value. */
2172 if (!DECL_BY_REFERENCE (resdecl))
2173 ret = gimple_build_return (restmp);
2174 else
2175 ret = gimple_build_return (resdecl);
2176
2177 gsi_insert_after (&bsi, ret, GSI_NEW_STMT);
2178 }
2179 else
2180 {
2181 gimple_call_set_tail (call, true);
2182 remove_edge (single_succ_edge (bb));
2183 }
2184
2185 cfun->gimple_df->in_ssa_p = true;
2186 update_max_bb_count ();
2187 profile_status_for_fn (cfun)
2188 = cfg_count.initialized_p () && cfg_count.ipa_p ()
2189 ? PROFILE_READ : PROFILE_GUESSED;
2190 /* FIXME: C++ FE should stop setting TREE_ASM_WRITTEN on thunks. */
2191 TREE_ASM_WRITTEN (thunk_fndecl) = false;
2192 delete_unreachable_blocks ();
2193 update_ssa (TODO_update_ssa);
2194 checking_verify_flow_info ();
2195 free_dominance_info (CDI_DOMINATORS);
2196
2197 /* Since we want to emit the thunk, we explicitly mark its name as
2198 referenced. */
2199 thunk.thunk_p = false;
2200 lowered = true;
2201 bitmap_obstack_release (NULL);
2202 }
2203 current_function_decl = NULL;
2204 set_cfun (NULL);
2205 return true;
2206 }
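/* For illustration only, a covariant return (hypothetical classes) is one
   source of the !this_adjusting case handled above: the thunk placed in the
   B-in-D vtable calls the target and then adjusts the returned pointer,
   guarding against a null result exactly as the then_bb/else_bb construction
   does.

     struct A { virtual void f (); };
     struct B { virtual B *clone (); };
     struct D : A, B { D *clone (); };	// returned D* is adjusted to B*  */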
2207
2208 /* Assemble thunks and aliases associated with the node. */
2209
2210 void
2211 cgraph_node::assemble_thunks_and_aliases (void)
2212 {
2213 cgraph_edge *e;
2214 ipa_ref *ref;
2215
2216 for (e = callers; e;)
2217 if (e->caller->thunk.thunk_p
2218 && !e->caller->inlined_to)
2219 {
2220 cgraph_node *thunk = e->caller;
2221
2222 e = e->next_caller;
2223 thunk->expand_thunk (true, false);
2224 thunk->assemble_thunks_and_aliases ();
2225 }
2226 else
2227 e = e->next_caller;
2228
2229 FOR_EACH_ALIAS (this, ref)
2230 {
2231 cgraph_node *alias = dyn_cast <cgraph_node *> (ref->referring);
2232 if (!alias->transparent_alias)
2233 {
2234 bool saved_written = TREE_ASM_WRITTEN (decl);
2235
2236 /* Force assemble_alias to really output the alias this time instead
2237 of buffering it in the alias pairs list. */
2238 TREE_ASM_WRITTEN (decl) = 1;
2239 if (alias->symver)
2240 do_assemble_symver (alias->decl,
2241 DECL_ASSEMBLER_NAME (decl));
2242 else
2243 do_assemble_alias (alias->decl,
2244 DECL_ASSEMBLER_NAME (decl));
2245 alias->assemble_thunks_and_aliases ();
2246 TREE_ASM_WRITTEN (decl) = saved_written;
2247 }
2248 }
2249 }
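/* For illustration only, declarations (hypothetical names) that reach the
   do_assemble_alias and do_assemble_symver calls above, using the GNU
   attribute syntax accepted by the C-family front ends (symver requires a
   target with ELF symbol versioning):

     void impl (void) { }
     void alias_fn (void) __attribute__ ((alias ("impl")));
     __attribute__ ((symver ("exported@VERS_2")))
     void exported (void) { }  */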
2250
2251 /* Expand function specified by node. */
2252
2253 void
2254 cgraph_node::expand (void)
2255 {
2256 location_t saved_loc;
2257
2258 /* We ought to not compile any inline clones. */
2259 gcc_assert (!inlined_to);
2260
2261 /* __RTL functions are compiled as soon as they are parsed, so don't
2262 do it again. */
2263 if (native_rtl_p ())
2264 return;
2265
2266 announce_function (decl);
2267 process = 0;
2268 gcc_assert (lowered);
2269 get_untransformed_body ();
2270
2271 /* Generate RTL for the body of DECL. */
2272
2273 timevar_push (TV_REST_OF_COMPILATION);
2274
2275 gcc_assert (symtab->global_info_ready);
2276
2277 /* Initialize the default bitmap obstack. */
2278 bitmap_obstack_initialize (NULL);
2279
2280 /* Initialize the RTL code for the function. */
2281 saved_loc = input_location;
2282 input_location = DECL_SOURCE_LOCATION (decl);
2283
2284 gcc_assert (DECL_STRUCT_FUNCTION (decl));
2285 push_cfun (DECL_STRUCT_FUNCTION (decl));
2286 init_function_start (decl);
2287
2288 gimple_register_cfg_hooks ();
2289
2290 bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
2291
2292 update_ssa (TODO_update_ssa_only_virtuals);
2293 execute_all_ipa_transforms (false);
2294
2295 /* Perform all tree transforms and optimizations. */
2296
2297 /* Signal the start of passes. */
2298 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_START, NULL);
2299
2300 execute_pass_list (cfun, g->get_passes ()->all_passes);
2301
2302 /* Signal the end of passes. */
2303 invoke_plugin_callbacks (PLUGIN_ALL_PASSES_END, NULL);
2304
2305 bitmap_obstack_release (&reg_obstack);
2306
2307 /* Release the default bitmap obstack. */
2308 bitmap_obstack_release (NULL);
2309
2310 /* If requested, warn about function definitions where the function will
2311 return a value (usually of some struct or union type) which itself will
2312 take up a lot of stack space. */
2313 if (!DECL_EXTERNAL (decl) && TREE_TYPE (decl))
2314 {
2315 tree ret_type = TREE_TYPE (TREE_TYPE (decl));
2316
2317 if (ret_type && TYPE_SIZE_UNIT (ret_type)
2318 && TREE_CODE (TYPE_SIZE_UNIT (ret_type)) == INTEGER_CST
2319 && compare_tree_int (TYPE_SIZE_UNIT (ret_type),
2320 warn_larger_than_size) > 0)
2321 {
2322 unsigned int size_as_int
2323 = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
2324
2325 if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
2326 warning (OPT_Wlarger_than_,
2327 "size of return value of %q+D is %u bytes",
2328 decl, size_as_int);
2329 else
2330 warning (OPT_Wlarger_than_,
2331 "size of return value of %q+D is larger than %wu bytes",
2332 decl, warn_larger_than_size);
2333 }
2334 }
2335
2336 gimple_set_body (decl, NULL);
2337 if (DECL_STRUCT_FUNCTION (decl) == 0
2338 && !cgraph_node::get (decl)->origin)
2339 {
2340 /* Stop pointing to the local nodes about to be freed.
2341 But DECL_INITIAL must remain nonzero so we know this
2342 was an actual function definition.
2343 For a nested function, this is done in c_pop_function_context.
2344 If rest_of_compilation set this to 0, leave it 0. */
2345 if (DECL_INITIAL (decl) != 0)
2346 DECL_INITIAL (decl) = error_mark_node;
2347 }
2348
2349 input_location = saved_loc;
2350
2351 ggc_collect ();
2352 timevar_pop (TV_REST_OF_COMPILATION);
2353
2354 /* Make sure that the back end didn't give up on compiling. */
2355 gcc_assert (TREE_ASM_WRITTEN (decl));
2356 if (cfun)
2357 pop_cfun ();
2358
2359 /* It would make a lot more sense to output thunks before the function body
2360 to get more forward and fewer backward jumps.  This however would need
2361 solving a problem with comdats.  See PR48668.  Also aliases must come after
2362 the function itself to make one-pass assemblers, like the one on AIX, happy.
2363 See PR 50689.
2364 FIXME: Perhaps thunks should be moved before the function iff they are not
2365 in comdat groups. */
2366 assemble_thunks_and_aliases ();
2367 release_body ();
2368 /* Eliminate all call edges. This is important so the GIMPLE_CALL no longer
2369 points to the dead function body. */
2370 remove_callees ();
2371 remove_all_references ();
2372 }
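/* For illustration only, a definition (hypothetical) that triggers the
   return-value size warning built above when compiled with, for example,
   -Wlarger-than=1024:

     struct big { char data[4096]; };
     struct big make_big (void) { struct big b = { { 0 } }; return b; }  */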
2373
2374 /* Node comparator that orders functions by the time at which they were
2375 first executed during profiling (tp_first_run). */
2376
2377 int
2378 tp_first_run_node_cmp (const void *pa, const void *pb)
2379 {
2380 const cgraph_node *a = *(const cgraph_node * const *) pa;
2381 const cgraph_node *b = *(const cgraph_node * const *) pb;
2382 unsigned int tp_first_run_a = a->tp_first_run;
2383 unsigned int tp_first_run_b = b->tp_first_run;
2384
2385 if (!opt_for_fn (a->decl, flag_profile_reorder_functions)
2386 || a->no_reorder)
2387 tp_first_run_a = 0;
2388 if (!opt_for_fn (b->decl, flag_profile_reorder_functions)
2389 || b->no_reorder)
2390 tp_first_run_b = 0;
2391
2392 if (tp_first_run_a == tp_first_run_b)
2393 return a->order - b->order;
2394
2395 /* Functions with a time profile must come before those without one; the wrap-around below sends tp_first_run == 0 to the largest value. */
2396 tp_first_run_a = (tp_first_run_a - 1) & INT_MAX;
2397 tp_first_run_b = (tp_first_run_b - 1) & INT_MAX;
2398
2399 return tp_first_run_a - tp_first_run_b;
2400 }
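/* For illustration only: with the wrap-around above, tp_first_run values
   1, 3 and 0 (no time profile) compare as 0, 2 and INT_MAX respectively, so
   profiled functions sort first in first-run order and unprofiled ones fall
   back to the a->order tie-break among themselves.  */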
2401
2402 /* Expand all functions that must be output.
2403
2404 Attempt to topologically sort the nodes so that a function is output when
2405 all of its callees have already been assembled, allowing data to be
2406 propagated across the callgraph.  Use a stack to get smaller distance
2407 between a function and its callees (later we may choose to use a more
2408 sophisticated algorithm for function reordering; we will likely want
2409 to use subsections to make the output functions appear in top-down
2410 order). */
2411
2412 static void
2413 expand_all_functions (void)
2414 {
2415 cgraph_node *node;
2416 cgraph_node **order = XCNEWVEC (cgraph_node *,
2417 symtab->cgraph_count);
2418 cgraph_node **tp_first_run_order = XCNEWVEC (cgraph_node *,
2419 symtab->cgraph_count);
2420 unsigned int expanded_func_count = 0, profiled_func_count = 0;
2421 int order_pos, tp_first_run_order_pos = 0, new_order_pos = 0;
2422 int i;
2423
2424 order_pos = ipa_reverse_postorder (order);
2425 gcc_assert (order_pos == symtab->cgraph_count);
2426
2427 /* The garbage collector may remove inline clones that we eliminate during
2428 optimization, so we must be sure not to reference them. */
2429 for (i = 0; i < order_pos; i++)
2430 if (order[i]->process)
2431 {
2432 if (order[i]->tp_first_run
2433 && opt_for_fn (order[i]->decl, flag_profile_reorder_functions))
2434 tp_first_run_order[tp_first_run_order_pos++] = order[i];
2435 else
2436 order[new_order_pos++] = order[i];
2437 }
2438
2439 /* First output functions with a time profile, in the specified order. */
2440 qsort (tp_first_run_order, tp_first_run_order_pos,
2441 sizeof (cgraph_node *), tp_first_run_node_cmp);
2442 for (i = 0; i < tp_first_run_order_pos; i++)
2443 {
2444 node = tp_first_run_order[i];
2445
2446 if (node->process)
2447 {
2448 expanded_func_count++;
2449 profiled_func_count++;
2450
2451 if (symtab->dump_file)
2452 fprintf (symtab->dump_file,
2453 "Time profile order in expand_all_functions:%s:%d\n",
2454 node->dump_asm_name (), node->tp_first_run);
2455 node->process = 0;
2456 node->expand ();
2457 }
2458 }
2459
2460 /* Output functions in RPO so callees get optimized before callers. This
2461 makes ipa-ra and other propagators work.
2462 FIXME: This is far from optimal code layout. */
2463 for (i = new_order_pos - 1; i >= 0; i--)
2464 {
2465 node = order[i];
2466
2467 if (node->process)
2468 {
2469 expanded_func_count++;
2470 node->process = 0;
2471 node->expand ();
2472 }
2473 }
2474
2475 if (dump_file)
2476 fprintf (dump_file, "Expanded functions with time profile (%s):%u/%u\n",
2477 main_input_filename, profiled_func_count, expanded_func_count);
2478
2479 if (symtab->dump_file && tp_first_run_order_pos)
2480 fprintf (symtab->dump_file, "Expanded functions with time profile:%u/%u\n",
2481 profiled_func_count, expanded_func_count);
2482
2483 symtab->process_new_functions ();
2484 free_gimplify_stack ();
2485
2486 free (order);
2487 }
2488
2489 /* Kind of a symbol table node, used when sorting nodes by the cgraph order number. */
2490
2491 enum cgraph_order_sort_kind
2492 {
2493 ORDER_UNDEFINED = 0,
2494 ORDER_FUNCTION,
2495 ORDER_VAR,
2496 ORDER_VAR_UNDEF,
2497 ORDER_ASM
2498 };
2499
2500 struct cgraph_order_sort
2501 {
2502 enum cgraph_order_sort_kind kind;
2503 union
2504 {
2505 cgraph_node *f;
2506 varpool_node *v;
2507 asm_node *a;
2508 } u;
2509 };
2510
2511 /* Output all functions, variables, and asm statements in the order
2512 according to their order fields, which is the order in which they
2513 appeared in the file. This implements -fno-toplevel-reorder. In
2514 this mode we may output functions and variables which don't really
2515 need to be output. */
2516
2517 static void
2518 output_in_order (void)
2519 {
2520 int max;
2521 cgraph_order_sort *nodes;
2522 int i;
2523 cgraph_node *pf;
2524 varpool_node *pv;
2525 asm_node *pa;
2526 max = symtab->order;
2527 nodes = XCNEWVEC (cgraph_order_sort, max);
2528
2529 FOR_EACH_DEFINED_FUNCTION (pf)
2530 {
2531 if (pf->process && !pf->thunk.thunk_p && !pf->alias)
2532 {
2533 if (!pf->no_reorder)
2534 continue;
2535 i = pf->order;
2536 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2537 nodes[i].kind = ORDER_FUNCTION;
2538 nodes[i].u.f = pf;
2539 }
2540 }
2541
2542 /* There is a similar loop in symbol_table::output_variables.
2543 Please keep them in sync. */
2544 FOR_EACH_VARIABLE (pv)
2545 {
2546 if (!pv->no_reorder)
2547 continue;
2548 if (DECL_HARD_REGISTER (pv->decl)
2549 || DECL_HAS_VALUE_EXPR_P (pv->decl))
2550 continue;
2551 i = pv->order;
2552 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2553 nodes[i].kind = pv->definition ? ORDER_VAR : ORDER_VAR_UNDEF;
2554 nodes[i].u.v = pv;
2555 }
2556
2557 for (pa = symtab->first_asm_symbol (); pa; pa = pa->next)
2558 {
2559 i = pa->order;
2560 gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
2561 nodes[i].kind = ORDER_ASM;
2562 nodes[i].u.a = pa;
2563 }
2564
2565 /* In toplevel reorder mode we output all statics; mark them as needed. */
2566
2567 for (i = 0; i < max; ++i)
2568 if (nodes[i].kind == ORDER_VAR)
2569 nodes[i].u.v->finalize_named_section_flags ();
2570
2571 for (i = 0; i < max; ++i)
2572 {
2573 switch (nodes[i].kind)
2574 {
2575 case ORDER_FUNCTION:
2576 nodes[i].u.f->process = 0;
2577 nodes[i].u.f->expand ();
2578 break;
2579
2580 case ORDER_VAR:
2581 nodes[i].u.v->assemble_decl ();
2582 break;
2583
2584 case ORDER_VAR_UNDEF:
2585 assemble_undefined_decl (nodes[i].u.v->decl);
2586 break;
2587
2588 case ORDER_ASM:
2589 assemble_asm (nodes[i].u.a->asm_str);
2590 break;
2591
2592 case ORDER_UNDEFINED:
2593 break;
2594
2595 default:
2596 gcc_unreachable ();
2597 }
2598 }
2599
2600 symtab->clear_asm_symbols ();
2601
2602 free (nodes);
2603 }
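/* For illustration only, a translation unit (hypothetical symbols) whose
   top-level entities map onto the ORDER_* kinds above; with
   -fno-toplevel-reorder they are emitted in exactly this source order:

     int counter = 1;				// ORDER_VAR
     asm ("# keep between counter and get");	// ORDER_ASM
     int get (void) { return counter; }	// ORDER_FUNCTION  */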
2604
2605 static void
2606 ipa_passes (void)
2607 {
2608 gcc::pass_manager *passes = g->get_passes ();
2609
2610 set_cfun (NULL);
2611 current_function_decl = NULL;
2612 gimple_register_cfg_hooks ();
2613 bitmap_obstack_initialize (NULL);
2614
2615 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_START, NULL);
2616
2617 if (!in_lto_p)
2618 {
2619 execute_ipa_pass_list (passes->all_small_ipa_passes);
2620 if (seen_error ())
2621 return;
2622 }
2623
2624 /* This extra symtab_remove_unreachable_nodes pass tends to catch functions made
2625 unreachable by devirtualization and other changes for which removal must iterate. */
2626 symtab->remove_unreachable_nodes (symtab->dump_file);
2627
2628 /* If pass_all_early_optimizations was not scheduled, the state of
2629 the cgraph will not be properly updated. Update it now. */
2630 if (symtab->state < IPA_SSA)
2631 symtab->state = IPA_SSA;
2632
2633 if (!in_lto_p)
2634 {
2635 /* Generate coverage variables and constructors. */
2636 coverage_finish ();
2637
2638 /* Process new functions added. */
2639 set_cfun (NULL);
2640 current_function_decl = NULL;
2641 symtab->process_new_functions ();
2642
2643 execute_ipa_summary_passes
2644 ((ipa_opt_pass_d *) passes->all_regular_ipa_passes);
2645 }
2646
2647 /* Some targets need to handle LTO assembler output specially. */
2648 if (flag_generate_lto || flag_generate_offload)
2649 targetm.asm_out.lto_start ();
2650
2651 if (!in_lto_p
2652 || flag_incremental_link == INCREMENTAL_LINK_LTO)
2653 {
2654 if (!quiet_flag)
2655 fprintf (stderr, "Streaming LTO\n");
2656 if (g->have_offload)
2657 {
2658 section_name_prefix = OFFLOAD_SECTION_NAME_PREFIX;
2659 lto_stream_offload_p = true;
2660 ipa_write_summaries ();
2661 lto_stream_offload_p = false;
2662 }
2663 if (flag_lto)
2664 {
2665 section_name_prefix = LTO_SECTION_NAME_PREFIX;
2666 lto_stream_offload_p = false;
2667 ipa_write_summaries ();
2668 }
2669 }
2670
2671 if (flag_generate_lto || flag_generate_offload)
2672 targetm.asm_out.lto_end ();
2673
2674 if (!flag_ltrans
2675 && ((in_lto_p && flag_incremental_link != INCREMENTAL_LINK_LTO)
2676 || !flag_lto || flag_fat_lto_objects))
2677 execute_ipa_pass_list (passes->all_regular_ipa_passes);
2678 invoke_plugin_callbacks (PLUGIN_ALL_IPA_PASSES_END, NULL);
2679
2680 bitmap_obstack_release (NULL);
2681 }
2682
2683
2684 /* Return the identifier naming the symbol that DECL's "alias" attribute refers to. */
2685
2686 static tree
2687 get_alias_symbol (tree decl)
2688 {
2689 tree alias = lookup_attribute ("alias", DECL_ATTRIBUTES (decl));
2690 return get_identifier (TREE_STRING_POINTER
2691 (TREE_VALUE (TREE_VALUE (alias))));
2692 }
2693
2694
2695 /* Weakrefs may be associated with external decls and thus not output
2696 at expansion time.  Emit all necessary aliases. */
2697
2698 void
2699 symbol_table::output_weakrefs (void)
2700 {
2701 symtab_node *node;
2702 FOR_EACH_SYMBOL (node)
2703 if (node->alias
2704 && !TREE_ASM_WRITTEN (node->decl)
2705 && node->weakref)
2706 {
2707 tree target;
2708
2709 /* Weakrefs are special in not requiring a target definition in the
2710 current compilation unit, so it is a bit hard to work out what we
2711 want to alias.
2712 When the alias target is defined, we need to fetch it from the symtab
2713 reference; otherwise it is pointed to by alias_target. */
2714 if (node->alias_target)
2715 target = (DECL_P (node->alias_target)
2716 ? DECL_ASSEMBLER_NAME (node->alias_target)
2717 : node->alias_target);
2718 else if (node->analyzed)
2719 target = DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl);
2720 else
2721 {
2722 gcc_unreachable ();
2723 target = get_alias_symbol (node->decl);
2724 }
2725 do_assemble_alias (node->decl, target);
2726 }
2727 }
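/* For illustration only, a weakref (hypothetical names) whose target need
   not be defined in this compilation unit, which is why the alias may have
   to be emitted here rather than at expansion time:

     static void local_impl (void) __attribute__ ((weakref ("external_impl")));

     void call_if_present (void)
     {
       if (local_impl)
	 local_impl ();
     }  */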
2728
2729 /* Drive IPA optimizations based on the callgraph and expand all functions that must be output. */
2730
2731 void
2732 symbol_table::compile (void)
2733 {
2734 if (seen_error ())
2735 return;
2736
2737 symtab_node::checking_verify_symtab_nodes ();
2738
2739 timevar_push (TV_CGRAPHOPT);
2740 if (pre_ipa_mem_report)
2741 dump_memory_report ("Memory consumption before IPA");
2742 if (!quiet_flag)
2743 fprintf (stderr, "Performing interprocedural optimizations\n");
2744 state = IPA;
2745
2746 /* If LTO is enabled, initialize the streamer hooks needed by GIMPLE. */
2747 if (flag_generate_lto || flag_generate_offload)
2748 lto_streamer_hooks_init ();
2749
2750 /* Don't run the IPA passes if there were any errors or sorry messages. */
2751 if (!seen_error ())
2752 {
2753 timevar_start (TV_CGRAPH_IPA_PASSES);
2754 ipa_passes ();
2755 timevar_stop (TV_CGRAPH_IPA_PASSES);
2756 }
2757 /* Do nothing else if any IPA pass found errors or if we are just streaming LTO. */
2758 if (seen_error ()
2759 || ((!in_lto_p || flag_incremental_link == INCREMENTAL_LINK_LTO)
2760 && flag_lto && !flag_fat_lto_objects))
2761 {
2762 timevar_pop (TV_CGRAPHOPT);
2763 return;
2764 }
2765
2766 global_info_ready = true;
2767 if (dump_file)
2768 {
2769 fprintf (dump_file, "Optimized ");
2770 symtab->dump (dump_file);
2771 }
2772 if (post_ipa_mem_report)
2773 dump_memory_report ("Memory consumption after IPA");
2774 timevar_pop (TV_CGRAPHOPT);
2775
2776 /* Output everything. */
2777 switch_to_section (text_section);
2778 (*debug_hooks->assembly_start) ();
2779 if (!quiet_flag)
2780 fprintf (stderr, "Assembling functions:\n");
2781 symtab_node::checking_verify_symtab_nodes ();
2782
2783 bitmap_obstack_initialize (NULL);
2784 execute_ipa_pass_list (g->get_passes ()->all_late_ipa_passes);
2785 bitmap_obstack_release (NULL);
2786 mark_functions_to_output ();
2787
2788 /* When weakref support is missing, we automatically translate all
2789 references to NODE to references to its ultimate alias target.
2790 The renaming mechanism uses flag IDENTIFIER_TRANSPARENT_ALIAS and
2791 TREE_CHAIN.
2792
2793 Set up this mapping before we output any assembler but once we are sure
2794 that all symbol renaming is done.
2795
2796 FIXME: All this ugliness can go away if we just do renaming at gimple
2797 level by physically rewriting the IL. At the moment we can only redirect
2798 calls, so we need infrastructure for renaming references as well. */
2799 #ifndef ASM_OUTPUT_WEAKREF
2800 symtab_node *node;
2801
2802 FOR_EACH_SYMBOL (node)
2803 if (node->alias
2804 && lookup_attribute ("weakref", DECL_ATTRIBUTES (node->decl)))
2805 {
2806 IDENTIFIER_TRANSPARENT_ALIAS
2807 (DECL_ASSEMBLER_NAME (node->decl)) = 1;
2808 TREE_CHAIN (DECL_ASSEMBLER_NAME (node->decl))
2809 = (node->alias_target ? node->alias_target
2810 : DECL_ASSEMBLER_NAME (node->get_alias_target ()->decl));
2811 }
2812 #endif
2813
2814 state = EXPANSION;
2815
2816 /* First output asm statements and anything with a fixed output order.  The
2817 process flag is cleared for these nodes, so we skip them later. */
2818 output_in_order ();
2819
2820 timevar_start (TV_CGRAPH_FUNC_EXPANSION);
2821 expand_all_functions ();
2822 timevar_stop (TV_CGRAPH_FUNC_EXPANSION);
2823
2824 output_variables ();
2825
2826 process_new_functions ();
2827 state = FINISHED;
2828 output_weakrefs ();
2829
2830 if (dump_file)
2831 {
2832 fprintf (dump_file, "\nFinal ");
2833 symtab->dump (dump_file);
2834 }
2835 if (!flag_checking)
2836 return;
2837 symtab_node::verify_symtab_nodes ();
2838 /* Double check that all inline clones are gone and that all
2839 function bodies have been released from memory. */
2840 if (!seen_error ())
2841 {
2842 cgraph_node *node;
2843 bool error_found = false;
2844
2845 FOR_EACH_DEFINED_FUNCTION (node)
2846 if (node->inlined_to
2847 || gimple_has_body_p (node->decl))
2848 {
2849 error_found = true;
2850 node->debug ();
2851 }
2852 if (error_found)
2853 internal_error ("nodes with unreleased memory found");
2854 }
2855 }
2856
2857 /* Earlydebug dump file, flags, and number. */
2858
2859 static int debuginfo_early_dump_nr;
2860 static FILE *debuginfo_early_dump_file;
2861 static dump_flags_t debuginfo_early_dump_flags;
2862
2863 /* Debug dump file, flags, and number. */
2864
2865 static int debuginfo_dump_nr;
2866 static FILE *debuginfo_dump_file;
2867 static dump_flags_t debuginfo_dump_flags;
2868
2869 /* Register the debug and earlydebug dump files. */
2870
2871 void
2872 debuginfo_early_init (void)
2873 {
2874 gcc::dump_manager *dumps = g->get_dumps ();
2875 debuginfo_early_dump_nr = dumps->dump_register (".earlydebug", "earlydebug",
2876 "earlydebug", DK_tree,
2877 OPTGROUP_NONE,
2878 false);
2879 debuginfo_dump_nr = dumps->dump_register (".debug", "debug",
2880 "debug", DK_tree,
2881 OPTGROUP_NONE,
2882 false);
2883 }
2884
2885 /* Initialize the debug and earlydebug dump files. */
2886
2887 void
2888 debuginfo_init (void)
2889 {
2890 gcc::dump_manager *dumps = g->get_dumps ();
2891 debuginfo_dump_file = dump_begin (debuginfo_dump_nr, NULL);
2892 debuginfo_dump_flags = dumps->get_dump_file_info (debuginfo_dump_nr)->pflags;
2893 debuginfo_early_dump_file = dump_begin (debuginfo_early_dump_nr, NULL);
2894 debuginfo_early_dump_flags
2895 = dumps->get_dump_file_info (debuginfo_early_dump_nr)->pflags;
2896 }
2897
2898 /* Finalize the debug and earlydebug dump files. */
2899
2900 void
2901 debuginfo_fini (void)
2902 {
2903 if (debuginfo_dump_file)
2904 dump_end (debuginfo_dump_nr, debuginfo_dump_file);
2905 if (debuginfo_early_dump_file)
2906 dump_end (debuginfo_early_dump_nr, debuginfo_early_dump_file);
2907 }
2908
2909 /* Set dump_file to the debug dump file. */
2910
2911 void
2912 debuginfo_start (void)
2913 {
2914 set_dump_file (debuginfo_dump_file);
2915 }
2916
2917 /* Undo setting dump_file to the debug dump file. */
2918
2919 void
2920 debuginfo_stop (void)
2921 {
2922 set_dump_file (NULL);
2923 }
2924
2925 /* Set dump_file to the earlydebug dump file. */
2926
2927 void
2928 debuginfo_early_start (void)
2929 {
2930 set_dump_file (debuginfo_early_dump_file);
2931 }
2932
2933 /* Undo setting dump_file to the earlydebug dump file. */
2934
2935 void
2936 debuginfo_early_stop (void)
2937 {
2938 set_dump_file (NULL);
2939 }
2940
2941 /* Analyze the whole compilation unit once it is parsed completely. */
2942
2943 void
2944 symbol_table::finalize_compilation_unit (void)
2945 {
2946 timevar_push (TV_CGRAPH);
2947
2948 /* If we're here there's no current function anymore. Some frontends
2949 are lazy in clearing these. */
2950 current_function_decl = NULL;
2951 set_cfun (NULL);
2952
2953 /* Do not skip analyzing the functions if there were errors; otherwise we
2954 would miss diagnostics for subsequent functions. */
2955
2956 /* Emit size functions we didn't inline. */
2957 finalize_size_functions ();
2958
2959 /* Mark alias targets necessary and emit diagnostics. */
2960 handle_alias_pairs ();
2961
2962 if (!quiet_flag)
2963 {
2964 fprintf (stderr, "\nAnalyzing compilation unit\n");
2965 fflush (stderr);
2966 }
2967
2968 if (flag_dump_passes)
2969 dump_passes ();
2970
2971 /* Gimplify and lower all functions, compute reachability and
2972 remove unreachable nodes. */
2973 analyze_functions (/*first_time=*/true);
2974
2975 /* Mark alias targets necessary and emit diagnostics. */
2976 handle_alias_pairs ();
2977
2978 /* Gimplify and lower thunks. */
2979 analyze_functions (/*first_time=*/false);
2980
2981 /* Offloading requires LTO infrastructure. */
2982 if (!in_lto_p && g->have_offload)
2983 flag_generate_offload = 1;
2984
2985 if (!seen_error ())
2986 {
2987 /* Emit early debug for reachable functions, and by consequence,
2988 locally scoped symbols. */
2989 struct cgraph_node *cnode;
2990 FOR_EACH_FUNCTION_WITH_GIMPLE_BODY (cnode)
2991 (*debug_hooks->early_global_decl) (cnode->decl);
2992
2993 /* Clean up anything that needs cleaning up after initial debug
2994 generation. */
2995 debuginfo_early_start ();
2996 (*debug_hooks->early_finish) (main_input_filename);
2997 debuginfo_early_stop ();
2998 }
2999
3000 /* Finally drive the pass manager. */
3001 compile ();
3002
3003 timevar_pop (TV_CGRAPH);
3004 }
3005
3006 /* Reset all state within cgraphunit.c so that we can rerun the compiler
3007 within the same process. For use by toplev::finalize. */
3008
3009 void
3010 cgraphunit_c_finalize (void)
3011 {
3012 gcc_assert (cgraph_new_nodes.length () == 0);
3013 cgraph_new_nodes.truncate (0);
3014
3015 vtable_entry_type = NULL;
3016 queued_nodes = &symtab_terminator;
3017
3018 first_analyzed = NULL;
3019 first_analyzed_var = NULL;
3020 }
3021
3022 /* Create a wrapper from this cgraph_node to the TARGET node.  A thunk is
3023 used as this kind of wrapper method. */
3024
3025 void
3026 cgraph_node::create_wrapper (cgraph_node *target)
3027 {
3028 /* Preserve DECL_RESULT so we get the right by-reference flag. */
3029 tree decl_result = DECL_RESULT (decl);
3030
3031 /* Remove the function's body but keep the arguments so they can be reused
3032 for the thunk. */
3033 release_body (true);
3034 reset ();
3035
3036 DECL_UNINLINABLE (decl) = false;
3037 DECL_RESULT (decl) = decl_result;
3038 DECL_INITIAL (decl) = NULL;
3039 allocate_struct_function (decl, false);
3040 set_cfun (NULL);
3041
3042 /* Turn alias into thunk and expand it into GIMPLE representation. */
3043 definition = true;
3044
3045 memset (&thunk, 0, sizeof (cgraph_thunk_info));
3046 thunk.thunk_p = true;
3047 create_edge (target, NULL, count);
3048 callees->can_throw_external = !TREE_NOTHROW (target->decl);
3049
3050 tree arguments = DECL_ARGUMENTS (decl);
3051
3052 while (arguments)
3053 {
3054 TREE_ADDRESSABLE (arguments) = false;
3055 arguments = TREE_CHAIN (arguments);
3056 }
3057
3058 expand_thunk (false, true);
3059
3060 /* Inline summary set-up. */
3061 analyze ();
3062 inline_analyze_function (this);
3063 }
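/* For illustration only, two semantically identical functions (hypothetical)
   that IPA identical code folding may merge; when an alias cannot be used,
   the discarded copy can be turned into a wrapper of the survivor through
   create_wrapper above:

     int f (int x) { return x * 2 + 1; }
     int g (int x) { return x * 2 + 1; }	// may become a wrapper of f  */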
3064
3065 #include "gt-cgraphunit.h"