Live variable analysis

In computer science, live variable analysis (or simply liveness analysis) is a classic data-flow analysis performed by compilers to calculate, for each program point, the variables that may potentially be read before their next write; that is, the variables that are "live" at the exit from each program point.

Stated simply: a variable is live if it holds a value that may be needed in the future.

It is a "backwards may" analysis. The analysis is done in a backwards order, and the dataflow confluence operator is set union.

L1: b := 3;
L2: c := 5;
L3: a := b + c;
goto L1;

The set of variables live at the exit of line L2 is {b, c}, since both are read in line L3; the set of variables live at the exit of line L1 is only {b}, since variable c is updated in line L2 before it is read. The value of variable a is never used, so the variable is never live.

The dataflow equations used for a given basic block s and the exit block final in live variable analysis are:

{\rm LIVE}_{\rm in}[s] = {\rm GEN}[s] \cup ({\rm LIVE}_{\rm out}[s] - {\rm KILL}[s])

{\rm LIVE}_{\rm out}[final] = \emptyset

{\rm LIVE}_{\rm out}[s] = \bigcup_{p \in {\rm succ}[s]} {\rm LIVE}_{\rm in}[p]

{\rm GEN}[d : y \leftarrow f(x_1, \cdots, x_n)] = \{x_1, \ldots, x_n\}

{\rm KILL}[d : y \leftarrow f(x_1, \cdots, x_n)] = \{y\}
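For instance, instantiating these definitions on line L3 of the snippet above, a := b + c, gives {\rm GEN}[L3] = \{b, c\} and {\rm KILL}[L3] = \{a\}: the assignment generates liveness for its operands b and c and kills liveness of its target a.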

The in-state of a block is the set of variables that are live at the start of the block. Its out-state is the set of variables that are live at the end of it. The out-state of a block is the union of the in-states of the block's successors. The transfer function of a statement is applied backwards: first the variables that are written are made dead, then the variables that are read are made live.
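As an illustration, here is a minimal Python sketch of this transfer function, applied backwards over the three-line snippet from the introduction. The statement encoding and the helper name transfer are assumptions of this sketch; the goto back edge is ignored because the computed result already satisfies it.

# Backward transfer: kill the written variables, then add the read ones.
def transfer(written, read, live_after):
    return (live_after - written) | read

# "L1: b := 3; L2: c := 5; L3: a := b + c" as (label, written, read) triples.
stmts = [
    ("L1", {"b"}, set()),
    ("L2", {"c"}, set()),
    ("L3", {"a"}, {"b", "c"}),
]

live = set()  # nothing is live after the last statement
for label, written, read in reversed(stmts):
    live = transfer(written, read, live)
    print(f"live entering {label}: {sorted(live)}")
# live entering L3: ['b', 'c']   (= live at the exit of L2)
# live entering L2: ['b']        (= live at the exit of L1)
# live entering L1: []           (a is never live)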

// in: {}
b1: a = 3;
    b = 5;
    d = 4;
    if a > b then
// out: {a,b,d}

// in: {a,b}
b2: c = a + b;
    d = 2;
// out: {b,d}

// in: {b,d}
b3: endif
    c = 4;
    return b * d + c;
// out: {}

The in-state of b3 only contains b and d, since within b3 the variable c is written before it is read. The out-state of b1 is the union of the in-states of b2 and b3. The definition of c in b2 can be removed, since c is not live immediately after that statement.
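Reading the GEN and KILL sets off the three blocks (a step the example leaves implicit) gives GEN[b1] = {}, KILL[b1] = {a, b, d}; GEN[b2] = {a, b}, KILL[b2] = {c, d}; GEN[b3] = {b, d}, KILL[b3] = {c}. These sets drive the fixed-point iteration that follows.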

Solving the data flow equations starts with initializing all in-states and out-states to the empty set. The work list is initialized by inserting the exit point (b3) in the work list (typical for backward flow). Its computed in-state differs from the previous one, so its predecessors b1 and b2 are inserted and the process continues. The progress is summarized in the table below.

processing   out-state   old in-state   new in-state   work list
b3           {}          {}             {b,d}          (b1, b2)
b1           {b,d}       {}             {}             (b2)
b2           {b,d}       {}             {a,b}          (b1)
b1           {a,b,d}     {}             {}             ()

Note that b1 was entered into the list before b2, which forced the algorithm to process b1 twice (b1 was re-inserted as a predecessor of b2). Inserting b2 before b1 would have allowed the analysis to complete earlier.

Initializing with the empty set is an optimistic initialization: all variables start out as dead. Note that the in-states cannot shrink from one iteration to the next, although an in-state can be smaller than the corresponding out-state. This can be seen from the fact that after the first iteration an in-state can only change through a change of its out-state. Since the out-states start as the empty set, they can only grow in further iterations.
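The following Python sketch runs this worklist algorithm on the three-block CFG above and reproduces the trace in the table. The data-structure layout and names such as live_in are assumptions of this sketch, not prescribed notation.

# GEN/KILL sets and edges for the example CFG (b1 -> b2, b3; b2 -> b3).
GEN  = {"b1": set(),           "b2": {"a", "b"}, "b3": {"b", "d"}}
KILL = {"b1": {"a", "b", "d"}, "b2": {"c", "d"}, "b3": {"c"}}
succ = {"b1": ["b2", "b3"],    "b2": ["b3"],     "b3": []}
pred = {"b1": [],              "b2": ["b1"],     "b3": ["b1", "b2"]}

# Optimistic initialization: every variable starts out dead.
live_in  = {b: set() for b in GEN}
live_out = {b: set() for b in GEN}

worklist = ["b3"]  # start from the exit block, as in the table above
while worklist:
    b = worklist.pop(0)
    # LIVE_out[b] is the union of the in-states of b's successors.
    live_out[b] = set().union(*(live_in[s] for s in succ[b]))
    # LIVE_in[b] = GEN[b] | (LIVE_out[b] - KILL[b])
    new_in = GEN[b] | (live_out[b] - KILL[b])
    if new_in != live_in[b]:
        live_in[b] = new_in
        # A changed in-state may change the out-state of every predecessor.
        worklist.extend(p for p in pred[b] if p not in worklist)

# Result: live_in  = b1: {}, b2: {a,b}, b3: {b,d}
#         live_out = b1: {a,b,d}, b2: {b,d}, b3: {}

Seeding the work list so that successors are processed before their predecessors (here b3, b2, b1) makes each block converge in a single visit, which is the earlier-completion ordering noted above.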

As of 2006, various program analyses such as live variable analysis have been formulated using Datalog. The Datalog specifications for such analyses are generally an order of magnitude shorter than their imperative counterparts (e.g. iterative analysis), and are at least as efficient (Whaley et al. 2004).

References

Whaley et al. (2004). Using Datalog with Binary Decision Diagrams for Program Analysis.