+2004-04-29 Kazu Hirata <kazu@cs.umass.edu>
+
+ * builtins.c, cgraph.c, cgraphunit.c, final.c, fold-const.c:
+ Fix comment typos.
+
2004-04-29 Douglas B Rupp <rupp@gnat.com>
* gcc.c (DELETE_IF_ORDINARY): New macro default definition.
/* Otherwise call the wrapper. This should be equivalent for the rest of
compiler, so the code does not diverge, and the wrapper may run the
- code neccesary for keeping the profiling sane. */
+ code necessary for keeping the profiling sane. */
switch (DECL_FUNCTION_CODE (fn))
{
caller.
Each edge has "inline_failed" field. When the field is set to NULL,
- the call will be inlined. When it is non-NULL it contains an reason
- why inlining wasn't performaned.
+ the call will be inlined. When it is non-NULL it contains a reason
+ why inlining wasn't performed.
The varpool data structure:
/* Hash table used to convert declarations into nodes. */
static GTY((param_is (struct cgraph_node))) htab_t cgraph_hash;
-/* We destructivly update callgraph during inlining and thus we need to
- keep information on whether inlining happent separately. */
+/* We destructively update callgraph during inlining and thus we need to
+ keep information on whether inlining happened separately. */
htab_t cgraph_inline_hash;
/* The linked list of cgraph nodes. */
/* This loop may turn out to be a performance problem. In such a case, adding
hashtables into call nodes with very many edges is probably the best
- sollution. It is not good idea to add pointer into CALL_EXPR itself
+ solution. It is not a good idea to add a pointer into CALL_EXPR itself
because we want to make it possible to have multiple cgraph nodes representing
different clones of the same body before the body is actually cloned. */
for (e = node->callees; e; e = e->next_callee)
eliminated
Reachable extern inline functions we sometimes inlined will be turned into
unanalyzed nodes so they look like true extern functions to the rest
- of code. Body of such functions is relased via remove_node once the
+ of code. Body of such functions is released via remove_node once the
inline clones are eliminated. */
for (node = cgraph_nodes; node; node = node->next)
{
else
e->callee->global.inlined_to = e->caller;
- /* Recursivly clone all bodies. */
+ /* Recursively clone all bodies. */
for (e = e->callee->callees; e; e = e->next_callee)
if (!e->inline_failed)
cgraph_clone_inlined_nodes (e, duplicate);
struct cgraph_edge *e, *next;
int times = 0;
- /* Look for all calls, mark them inline and clone recursivly
+ /* Look for all calls, mark them inline and clone recursively
all inlined functions. */
for (e = what->callers; e; e = next)
{
/* Return true when inlining WHAT would create recursive inlining.
We call recursive inlining all cases where the same function appears more than
- once in the single recusion nest path in the inline graph. */
+ once in the single recursion nest path in the inline graph. */
static bool
cgraph_recursive_inlining_p (struct cgraph_node *to,
case NOTE_INSN_BASIC_BLOCK:
- /* If we are performing the optimization that paritions
+ /* If we are performing the optimization that partitions
basic blocks into hot & cold sections of the .o file,
then at the start of each new basic block, before
beginning to write code for the basic block, we need to
expression, and ARG to `a'. If COND_FIRST_P is nonzero, then the
COND is the first argument to CODE; otherwise (as in the example
given here), it is the second argument. TYPE is the type of the
- original expression. Return NULL_TREE if no simplication is
+ original expression. Return NULL_TREE if no simplification is
possible. */
static tree