
1. The Jargon File (version 4.4.7, 29 Dec 2003)
vaxocentrism
 /vak`soh-sen'trizm/, n.

    [analogy with "ethnocentrism"] A notional disease said to afflict C
    programmers who persist in coding according to certain assumptions that are
    valid (esp. under Unix) on VAXen but false elsewhere. Among these are the
    following (several are illustrated by short C sketches after the list):

     1. The assumption that dereferencing a null pointer is safe because it is
        all bits 0, and location 0 is readable and 0. Problem: this may instead
        cause an illegal-address trap on non-VAXen, and even on VAXen under
        OSes other than BSD Unix. Usually this is an implicit assumption of
        sloppy code (forgetting to check the pointer before using it), rather
        than deliberate exploitation of a misfeature.

     2. The assumption that characters are signed.

     3. The assumption that a pointer to any one type can freely be cast into a
        pointer to any other type. A stronger form of this is the assumption
        that all pointers are the same size and format, which means you don't
        have to worry about getting the casts or types correct in calls.
        Problem: this fails on word-oriented machines or others with multiple
        pointer formats.

     4. The assumption that the parameters of a routine are stored in memory,
        on a stack, contiguously, and in strictly ascending or descending
        order. Problem: this fails on many RISC architectures.

     5. The assumption that pointer and integer types are the same size, and
        that pointers can be stuffed into integer variables (and vice-versa)
        and drawn back out without being truncated or mangled. Problem: this
        fails on segmented architectures or word-oriented machines with funny
        pointer formats.

     6. The assumption that a data type of any size may begin at any byte
        address in memory (for example, that you can freely construct and
        dereference a pointer to a word- or greater-sized object at an odd char
        address). Problem: this fails on many (esp. RISC) architectures better
        optimized for HLL execution speed, and can cause an illegal address
        fault or bus error.

     7. The (related) assumption that there is no padding at the end of types
        and that in an array you can thus step right from the last byte of a
        previous component to the first byte of the next one. This is not only
        machine- but compiler-dependent.

     8. The assumption that memory address space is globally flat and that the
        array reference foo[-1] is necessarily valid. Problem: this fails at 0,
        or other places on segment-addressed machines like Intel chips (yes,
        segmentation is universally considered a brain-damaged way to design
        machines (see moby), but that is a separate issue).

     9. The assumption that objects can be arbitrarily large with no special
        considerations. Problem: this fails on segmented architectures and
        under non-virtual-addressing environments.

    10. The assumption that the stack can be as large as memory. Problem: this
        fails on segmented architectures or almost anything else without
        virtual addressing and a paged stack.

    11. The assumption that bits and addressable units within an object are
        ordered in the same way and that this order is a constant of nature.
        Problem: this fails on big-endian machines.

    12. The assumption that it is meaningful to compare pointers to different
        objects not located within the same array, or to objects of different
        types. Problem: the former fails on segmented architectures, the latter
        on word-oriented machines or others with multiple pointer formats.

    13. The assumption that an int is 32 bits, or (nearly equivalently) the
        assumption that sizeof(int) == sizeof(long). Problem: this fails on 
        PDP-11s, 286-based systems and even on 386 and 68000 systems under
        some compilers (and on 64-bit systems like the Alpha, of course).

    14. The assumption that argv[] is writable. Problem: this fails in many
        embedded-systems C environments and even under a few flavors of Unix.
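
    The short C sketches below are editorial additions, not part of the
    Jargon File entry; each is a minimal, hedged illustration of one of the
    assumptions above and of a portable alternative.

    Assumption 1 (null pointers): the sloppy pattern and the portable fix.
    getenv("HOME") is used only as a convenient source of a pointer that may
    legitimately be NULL.

        #include <stdio.h>
        #include <stdlib.h>

        int main(void)
        {
            const char *home = getenv("HOME");    /* may be NULL */

            /* Vaxocentric: printf("%c\n", *home); assumes that *NULL
               quietly reads as 0 instead of trapping. */

            /* Portable: test the pointer before dereferencing it. */
            if (home != NULL)
                printf("first char of HOME: %c\n", *home);
            else
                printf("HOME is not set\n");
            return 0;
        }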
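
    Assumption 2 (signed characters): the classic portable idiom is to hold
    getchar()'s result in an int, so the code works whether plain char is
    signed or unsigned.

        #include <stdio.h>

        int main(void)
        {
            /* Vaxocentric: "char c;" assumes plain char is signed and can
               hold EOF (-1). Where char is unsigned, the loop below would
               never terminate; where it is signed, a 0xFF input byte would
               be mistaken for EOF. */
            int c;    /* getchar() returns an int */

            while ((c = getchar()) != EOF)
                putchar(c);
            return 0;
        }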
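
    Assumption 3 (pointer casts): a sketch of the portable discipline of
    converting only through void * and back to the original type, rather
    than treating all pointers as interchangeable.

        #include <stdio.h>

        int main(void)
        {
            double d = 3.14;
            void *p = &d;        /* void * may hold any object pointer */

            /* Vaxocentric: int *ip = (int *)&d; ... *ip; treats a double
               as an int and assumes every pointer has the same size and
               format. */

            /* Portable: convert back to the original type before use. */
            double *dp = p;
            printf("%f\n", *dp);
            return 0;
        }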
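
    Assumption 4 (parameter layout): a sketch of a variadic sum written with
    <stdarg.h>, which works however the ABI actually passes parameters,
    instead of walking memory from the address of the first argument.

        #include <stdarg.h>
        #include <stdio.h>

        /* Vaxocentric version: int *p = &count; then p[1], p[2], ...
           assumes the arguments sit contiguously on a stack. */
        static int sum(int count, ...)
        {
            va_list ap;
            int total = 0;

            va_start(ap, count);
            while (count-- > 0)
                total += va_arg(ap, int);
            va_end(ap);
            return total;
        }

        int main(void)
        {
            printf("%d\n", sum(3, 1, 2, 3));    /* prints 6 */
            return 0;
        }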
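
    Assumption 5 (pointer/integer size): a sketch using uintptr_t from C99's
    <stdint.h> (an optional but nearly universal type) when a pointer really
    must round-trip through an integer.

        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            int x = 42;

            /* Vaxocentric: int addr = (int)&x; assumes
               sizeof(int) == sizeof(void *). */

            /* Portable where <stdint.h> provides it: uintptr_t is an
               unsigned integer type wide enough to hold an object pointer
               and convert it back unchanged. */
            uintptr_t addr = (uintptr_t)&x;
            int *back = (int *)addr;
            printf("%d\n", *back);    /* prints 42 */
            return 0;
        }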
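
    Assumption 6 (alignment): a sketch that reads a 32-bit value from an
    arbitrary byte offset with memcpy instead of dereferencing a possibly
    misaligned pointer.

        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            unsigned char buf[8] = { 0, 0x78, 0x56, 0x34, 0x12, 0, 0, 0 };
            uint32_t v;

            /* Vaxocentric: v = *(uint32_t *)(buf + 1); dereferences a
               misaligned pointer and can raise a bus error on
               strict-alignment machines. */

            /* Portable: let memcpy perform the byte-wise access. */
            memcpy(&v, buf + 1, sizeof v);
            printf("0x%08lx\n", (unsigned long)v);   /* printed value depends
                                                        on host byte order */
            return 0;
        }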
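
    Assumption 7 (trailing padding): a sketch showing that sizeof on the
    element type, not the sum of the member sizes, is the correct stride
    through an array.

        #include <stdio.h>

        struct rec {
            int  id;
            char tag;    /* the compiler may add padding after this */
        };

        int main(void)
        {
            struct rec table[4];

            /* Vaxocentric: stepping sizeof(int) + sizeof(char) bytes per
               element assumes there is no trailing padding. */
            printf("element stride: %zu bytes\n", sizeof table[0]);
            printf("whole array:    %zu bytes\n", sizeof table);
            return 0;
        }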
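
    Assumption 11 (byte order): a sketch that serializes a 32-bit value by
    shifting, so the output layout is the same regardless of how the host
    orders bytes within an object.

        #include <stdint.h>
        #include <stdio.h>

        static void put_be32(unsigned char out[4], uint32_t v)
        {
            out[0] = (unsigned char)(v >> 24);
            out[1] = (unsigned char)(v >> 16);
            out[2] = (unsigned char)(v >> 8);
            out[3] = (unsigned char)v;
        }

        int main(void)
        {
            unsigned char buf[4];

            put_be32(buf, 0x12345678u);
            /* Prints "12 34 56 78" on big- and little-endian hosts alike. */
            printf("%02x %02x %02x %02x\n", buf[0], buf[1], buf[2], buf[3]);
            return 0;
        }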
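
    Assumption 13 (integer widths): a sketch that asks the implementation
    for its sizes instead of assuming them, and uses int32_t (again from
    <stdint.h>, where available) when exactly 32 bits are required.

        #include <limits.h>
        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            /* Vaxocentric: assuming int is 32 bits, or that
               sizeof(int) == sizeof(long). */
            printf("sizeof(int)  = %zu\n", sizeof(int));
            printf("sizeof(long) = %zu\n", sizeof(long));
            printf("INT_MAX      = %d\n", INT_MAX);

            /* When exactly 32 bits are needed, say so explicitly. */
            int32_t counter = 0;
            printf("sizeof(int32_t) = %zu\n", sizeof counter);
            return 0;
        }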
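
    Assumption 14 (writable argv): a sketch that copies an argument string
    into heap memory before modifying it, rather than writing into argv[]
    itself.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        int main(int argc, char **argv)
        {
            if (argc < 2)
                return 0;

            /* Vaxocentric: argv[1][0] = 'X'; assumes the argument strings
               live in writable memory. */

            /* Portable: work on a private copy. */
            size_t len = strlen(argv[1]) + 1;
            char *copy = malloc(len);
            if (copy == NULL)
                return 1;
            memcpy(copy, argv[1], len);
            copy[0] = 'X';
            printf("%s\n", copy);
            free(copy);
            return 0;
        }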

    Note that a programmer can validly be accused of vaxocentrism even if he or
    she has never seen a VAX. Some of these assumptions (esp. 2--5) were
    valid on the PDP-11, the original C machine, and became endemic years
    before the VAX. The terms vaxocentricity and all-the-world's-a-VAX syndrome
    have been used synonymously.


2. The Free On-line Dictionary of Computing (30 December 2018)
vaxocentrism

   /vak"soh-sen"trizm/ [analogy with "ethnocentrism"] A notional
   disease said to afflict C programmers who persist in coding
   according to certain assumptions that are valid (especially
   under Unix) on VAXen but false elsewhere. Among these are:

   1. The assumption that dereferencing a null pointer is safe
   because it is all bits 0, and location 0 is readable and 0.
   Problem: this may instead cause an illegal-address trap on
   non-VAXen, and even on VAXen under OSes other than BSD Unix.
   Usually this is an implicit assumption of sloppy code
   (forgetting to check the pointer before using it), rather than
   deliberate exploitation of a misfeature.

   2. The assumption that characters are signed.

   3. The assumption that a pointer to any one type can freely be
   cast into a pointer to any other type.  A stronger form of
   this is the assumption that all pointers are the same size and
   format, which means you don't have to worry about getting the
   casts or types correct in calls.  Problem: this fails on
   word-oriented machines or others with multiple pointer
   formats.

   4. The assumption that the parameters of a routine are stored
   in memory, on a stack, contiguously, and in strictly ascending
   or descending order.  Problem: this fails on many RISC
   architectures.

   5. The assumption that pointer and integer types are the same
   size, and that pointers can be stuffed into integer variables
   (and vice-versa) and drawn back out without being truncated or
   mangled.  Problem: this fails on segmented architectures or
   word-oriented machines with funny pointer formats.

   6. The assumption that a data type of any size may begin at
   any byte address in memory (for example, that you can freely
   construct and dereference a pointer to a word- or
   greater-sized object at an odd char address).  Problem: this
   fails on many (especially RISC) architectures better optimised
   for HLL execution speed, and can cause an illegal address
   fault or bus error.

   7. The (related) assumption that there is no padding at the
   end of types and that in an array you can thus step right from
   the last byte of a previous component to the first byte of the
   next one.  This is not only machine- but compiler-dependent.

   8. The assumption that memory address space is globally flat
   and that the array reference "foo[-1]" is necessarily valid.
   Problem: this fails at 0, or other places on segment-addressed
   machines like Intel chips (yes, segmentation is universally
   considered a brain-damaged way to design machines (see
   moby), but that is a separate issue).

   9. The assumption that objects can be arbitrarily large with
   no special considerations.  Problem: this fails on segmented
   architectures and under non-virtual-addressing environments.

   10. The assumption that the stack can be as large as memory.
   Problem: this fails on segmented architectures or almost
   anything else without virtual addressing and a paged stack.

   11. The assumption that bits and addressable units within an
   object are ordered in the same way and that this order is a
   constant of nature.  Problem: this fails on big-endian
   machines.

   12. The assumption that it is meaningful to compare pointers
   to different objects not located within the same array, or to
   objects of different types.  Problem: the former fails on
   segmented architectures, the latter on word-oriented machines
   or others with multiple pointer formats.

   13. The assumption that an "int" is 32 bits, or (nearly
   equivalently) the assumption that "sizeof(int) ==
   sizeof(long)".  Problem: this fails on PDP-11s, Intel
   80286-based systems and even on Intel 80386 and Motorola
   68000 systems under some compilers.

   14. The assumption that "argv[]" is writable.  Problem: this
   fails in many embedded-systems C environments and even under a
   few flavours of Unix.

   Note that a programmer can validly be accused of vaxocentrism
   even if he or she has never seen a VAX.  Some of these
   assumptions (especially 2--5) were valid on the PDP-11, the
   original C machine, and became endemic years before the VAX.
   The terms "vaxocentricity" and "all-the-world's-a-VAX
   syndrome" have been used synonymously.

   [Jargon File]

