NAME ^

docs/glossary.pod - Parrot Glossary

SUMMARY ^

Short descriptions of terms that show up in Parrot development.

GLOSSARY ^

Continuations ^

Think of continuations as an execution "context". This context includes everything local to that execution path, not just the stack. It is a snapshot in time (minus global variables). It is similar to C's setjmp (taking the continuation) and longjmp (invoking the continuation), except that longjmp'ing only works "down" the stack; jumping "up" the stack (i.e., back to a frame that has already returned) is undefined behaviour. Continuations can work in either direction.
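
For comparison, here is a minimal, purely illustrative C sketch of the setjmp/longjmp pattern mentioned above. It only demonstrates the one-shot, downward-only jump; the frame that called setjmp must still be live, which is exactly the restriction continuations don't have.

    #include <setjmp.h>
    #include <stdio.h>

    static jmp_buf ctx;     /* records the frame that called setjmp() */

    static void inner(void)
    {
        /* Unwind "down" the stack to the still-live frame that called
           setjmp(). Jumping "up" to a frame that has already returned
           would be undefined behaviour. */
        longjmp(ctx, 1);
    }

    int main(void)
    {
        if (setjmp(ctx) == 0) {     /* "take" the context */
            inner();                /* "invoke" it from deeper in the stack */
        } else {
            printf("resumed after longjmp\n");
        }
        return 0;
    }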

We can do two important things with continuations:

  1. Create and pass a continuation object to a subroutine, which may recursively pass that object up the call chain until, at some point, the continuation can be called/executed to handle the final computation or return value. This is pretty much tail recursion.
  2. Continuations can be taken at an arbitrary call depth, freezing the call chain (context) at that point in time. If we save that continuation object into a variable, we can later reinstate the complete context via its "handle". This allows neat things like backtracking that aren't easily done in conventional stack-based languages such as C. Since continuations represent "branches" in context, they require an environment that uses some combination of heap-based stacks, stack trees, and/or stack copying.

It is common for a system that supports continuations to implement co-routines on top of them.

A continuation is a sort of super-closure. When you take a continuation, it makes a note of the current call stack and lexical scratchpads, along with the current location in the code. When you invoke a continuation, the system drops what it's doing, puts the call stack and scratchpads back, and jumps to the execution point you were at when the continuation was taken. It is, in effect, like you never left that point in your code.

Note that, as with closures, it only puts the *scratchpads* back in scope - it doesn't do anything with the values in the variables that are in those scratchpads.

Co-Routines ^

Co-routines are virtually identical to normal subroutines, except that while a subroutine always executes from its first instruction to the point where it returns, a co-routine may suspend itself (or be suspended asynchronously, if the language permits) and later resume from that point. Co-routines can be used to implement things like "factories": if the co-routine never returns, each call "resumes" the routine where it left off.

A co-routine is a subroutine that can stop in the middle, and start back up later at the point you stopped. For example:

    sub sample : coroutine {
       print "A\n";
       yield;
       print "B\n";
       return;
    }

    sample();
    print "Foo!\n";
    sample();

will print

     A
     Foo!
     B

Basically, the yield keyword says, "Stop here, but the next time we're called, pick up at the next statement." If you return from a co-routine, the next invocation starts back at the beginning. Co-routines remember all their state, local variables, and suchlike things.
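
C has no co-routines of its own, but the "pick up at the next statement" behaviour above can be approximated with a static state variable. This is only a sketch of the control flow, not how Parrot implements co-routines:

    #include <stdio.h>

    /* A hand-rolled approximation of the sample() co-routine above:
       the static 'state' variable remembers where to resume. */
    static void sample(void)
    {
        static int state = 0;

        switch (state) {
        case 0:
            printf("A\n");
            state = 1;      /* "yield": resume after this point next time */
            return;
        case 1:
            printf("B\n");
            state = 0;      /* "return": next call starts at the beginning */
            return;
        }
    }

    int main(void)
    {
        sample();           /* prints "A" */
        printf("Foo!\n");
        sample();           /* prints "B" */
        return 0;
    }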

COW ^

COW stands for Copy On Write. This is a pure speed-hack technique that copies strings without actually copying the string data until it's absolutely necessary.

If you have a string A and make a copy of it to get string B, the two strings should be identical, at least to start with. With COW they are, because string A and string B aren't actually two separate strings - they're the same string, marked COW. If either string A or string B is changed, the system notices and only at that point makes a copy of the string data and changes the copy.

If the program never actually changes the string - something that's fairly common - the program need never make a copy, saving both memory and time.
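
The idea can be sketched in a few lines of C. The names and layout here are purely illustrative (CowStr, cow_copy, and cow_set_char are not Parrot's actual string internals):

    #include <stdlib.h>
    #include <string.h>

    /* Several handles may share one buffer; refcount says how many. */
    typedef struct {
        char   *data;
        size_t  len;
        int    *refcount;
    } CowStr;

    /* "Copying" a COW string only bumps the reference count. */
    CowStr cow_copy(CowStr *s)
    {
        (*s->refcount)++;
        return *s;
    }

    /* Before writing, make a private copy if the buffer is shared. */
    void cow_set_char(CowStr *s, size_t i, char c)
    {
        if (*s->refcount > 1) {
            char *priv = malloc(s->len + 1);
            memcpy(priv, s->data, s->len + 1);
            (*s->refcount)--;
            s->data      = priv;
            s->refcount  = malloc(sizeof *s->refcount);
            *s->refcount = 1;
        }
        s->data[i] = c;
    }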

destruction ^

Destruction is low-level memory cleanup, such as calling free() on malloc()ed memory. This happens after "finalization", and if resources are adequate, it may only happen as a side effect of program exit.

DOD ^

Dead Object Detection is the process of sweeping through all the objects, variables, and whatnot inside of Parrot, and deciding which ones are in use and which ones aren't. The ones that aren't in use are then freed up for later reuse. (After they're destroyed, if active destruction is warranted.)

See also: "GC"

finalization ^

Finalization is high-level, user visible cleanup of objects, such as closing an associated DB handle. Finalization reduces active objects down to passive blocks of memory, but does not actually reclaim that memory. Memory is reclaimed by the related "destruction" operation, as and when necessary.

GC ^

Garbage Collection is the process of sweeping through all the active objects, variables, and structures, marking the memory they're using as in use; all other memory is then freed up for later reuse.

Garbage Collection and Dead Object Detection are separate in Parrot, since we generally chew through memory segments faster than we chew through objects. (This is a characteristic peculiar to Perl and other languages that do string processing; other languages chew through objects faster than memory.)

See also: "DOD"

ICU ^

International Components for Unicode

ICU is a C and C++ library that provides support for Unicode on a variety of platforms. It was added to Parrot with the 0.0.8 release to provide future Unicode support.

http://oss.software.ibm.com/icu/index.html

IMC ^

Parrot InterMediate Code, or another name for PIR. A convention once arose to name input files to IMCC with the extension ".imc", so the name IMC became synonymous with PIR. This file extension has since been deprecated.

See also "IMCC", "PIR".

IMCC ^

Parrot's Intermediate Code Compiler, which started its life as an improved Parrot assembler and eventually became so integrated with Parrot that it now serves as the Parrot executable (able to load and run PBC, PASM, or PIR files).

See also "PIR".

JAPH ^

In the Parrot context, a JAPH is a small script that prints the string "Just another Parrot Hacker".

Native Call Interface ^

This lets Parrot talk to native C libraries without needing a C compiler.

NCI ^

Acronym for Native Call Interface.

PAST ^

Acronym for Parrot Abstract Syntax Tree.

Packfile ^

Another name for a PBC file, due to the names used for data structures in one of the early implementations in Perl 5.

PBC ^

Parrot Byte Code. The name for the "executable" files that can be passed to the Parrot interpreter for immediate execution (although PASM and IMC files can be executed directly, too).

See also "Packfile".

PDD ^

Acronym for Parrot Design Document. These documents describe the features that the Parrot interpreter should implement.

See also "pdd00_pdd" in pdds.

PIR ^

Parrot Intermediate Representation. A medium-level assembly language for Parrot that hides messy details like register allocation so language compiler writers who target PIR don't have to roll their own.

See also "IMC".

PMC ^

PMC is an acronym for Parrot Magic Cookie. (Or Cracker, your choice.) PMC classes are the primitives that Parrot-based languages use to represent their fundamental types, such as Perl's scalar values.

POD ^

POD is an acronym for Plain Old Documentation. For now, this is the preferred form for all kinds of documentation in Parrot. See "perldoc perlpod" for details.

Predereferencing ^

A bytecode transformation technique which reduces the amount of pointer dereferencing done in the inner loop of the interpreter by pre-converting opcode numbers into pointers to their opfuncs, and also converting the register numbers and constant numbers in the arguments to the ops into pointers.
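
A very rough sketch of the difference in C (the types and names here are illustrative, not Parrot's actual data structures):

    #include <stddef.h>

    /* Illustrative op function: executes one op, returns the next position. */
    typedef size_t (*opfunc_t)(size_t pc, long *regs);

    /* Regular loop: every iteration maps an opcode number to its opfunc. */
    void run_plain(const int *bytecode, size_t len, opfunc_t *op_table, long *regs)
    {
        size_t pc = 0;
        while (pc < len)
            pc = op_table[bytecode[pc]](pc, regs);
    }

    /* Predereferenced loop: the bytecode has been rewritten in advance so
       each opcode slot already holds the opfunc pointer itself (and, in
       the real thing, argument slots hold pointers to registers and
       constants), saving one table lookup per op. */
    void run_prederef(opfunc_t *prederef_code, size_t len, long *regs)
    {
        size_t pc = 0;
        while (pc < len)
            pc = prederef_code[pc](pc, regs);
    }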

The original implementation by Gregor Purdy was posted on 2001-12-11. On one test system, it resulted in a 22% speed increase on a test program with a tight inner loop.

http://archive.develooper.com/perl6-internals@perl.org/msg06941.html

On 2001-12-18, predereferencing got a speed boost (to about 47% faster than the regular DO_OP inner loop -- without compiler optimizations turned on). This was due to an off-list (actually over lunch) suggestion by John Kennedy that instead of pre-initializing the new copy of the bytecode with NULL pointers, we pre-initialize it with pointers to a pseudo-opfunc that does the predereferencing whenever it is encountered.

On 2002-04-11, Jason Gloudon suggested combining aspects of the Computed Goto Core and the Prederef[erencing] Core.

http://archive.develooper.com/perl6-internals@perl.org/msg07064.html

The week of 2003-02-09, Leopold Toetsch combined Computed Goto and Predereferencing to produce the CGP core.

http://dev.perl.org/perl6/list-summaries/2003/p6summary.2003-02-09.html#Week_of_the_alternative_runloops

Later, on 2003-02-14, Leopold Toetsch and Nicholas Clark combined the JIT and the Computed Goto Prederef cores to great effect.

http://www.perl.com/pub/a/2003/02/p6pdigest/20030216.html

run core ^

Also known as a run loop or runcore: the way Parrot executes Parrot Byte Code. See running.pod for how to tell parrot which run core should be used.

Vtable ^

A table of operations attached to some data types, such as PMCs and strings. Vtables are used to avoid using switches or long if chains to handle different data types. They're similar to method calls, except that their names are pre-selected.
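
The mechanism is essentially a structure of function pointers, roughly like this illustrative C sketch (not Parrot's actual vtable layout):

    struct PMC;     /* some data type the vtable operates on */

    /* One table of operations; each type supplies its own functions. */
    typedef struct {
        void (*init)(struct PMC *self);
        long (*get_integer)(struct PMC *self);
        void (*set_integer)(struct PMC *self, long value);
    } VTable;

    typedef struct PMC {
        const VTable *vtable;   /* which set of operations this value uses */
        long          cache;
    } PMC;

    /* A call dispatches through the value's vtable, with no switch or
       if/else chain on the type. */
    static long pmc_get_integer(PMC *p)
    {
        return p->vtable->get_integer(p);
    }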

Warnock's Dilemma ^

The dilemma you face when posting a message to a public forum about something and not even getting an acknowledgment of its existence. This leaves you wondering if your problem is unimportant or previously addressed, if everyone's waiting on someone else to answer you, or if maybe your mail never actually made it to anyone else in the forum.

CORRECTIONS ^

Please send corrections to the perl6-internals mailing list.

