Byte - February 1976


He was a member of the Homebrew Computer Club and made significant contributions to the software for early microcomputer systems from Tandy Corporation and Cromemco. His Palo Alto Tiny BASIC appeared in the May 1976 issue (Vol. 1, No. 5) of Dr. Dobb's Journal; the interpreter occupied about 1.77 K bytes of memory.

The first color graphics interface for microcomputers, developed by Cromemco and called the Dazzler, was introduced with a demonstration program called "Kaleidoscope" written by Wang. A store owner placed a color television in his window displaying the colorful, ever-changing kaleidoscopic patterns generated by the Dazzler and Wang's software. In a short time the Dazzler had caused a traffic jam on 5th Avenue!

Wang also developed "3K Control Basic" for Cromemco.

In general it is possible to construct a CMAC equivalent of any finite state automaton. Of course, CMAC can accept inputs and produce outputs which are nonbinary. Furthermore, the outputs generalize. Thus, CMAC is a sort of "fuzzy state automaton." A Cerebellar Model Arithmetic Computer with direct feedback from output to input demonstrates how a neural cluster can generate a string of outputs (subgoals) in response to a single input, or an unchanging string of inputs.
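
A rough sketch in modern code may make the feedback idea concrete. The table below stands in for a trained CMAC weight memory, and the command and subgoal names are invented purely for illustration; the point is only that feeding the output back as an input lets one unchanging command unroll into a string of subgoals.

    # Minimal sketch of a CMAC-like module with its output fed back to its input.
    # The lookup table stands in for the trained weight memory; the command and
    # subgoal names are hypothetical.
    NEXT = {
        ("FETCH", "IDLE"):  "REACH",
        ("FETCH", "REACH"): "GRASP",
        ("FETCH", "GRASP"): "LIFT",
        ("FETCH", "LIFT"):  "IDLE",
    }

    def cmac_step(command, prev_output):
        # A real CMAC interpolates over overlapping table cells; a plain lookup
        # is its finite state automaton limit.
        return NEXT.get((command, prev_output), "IDLE")

    output = "IDLE"
    for _ in range(4):                  # one unchanging input...
        output = cmac_step("FETCH", output)
        print(output)                   # ...yields the string REACH, GRASP, LIFT, IDLE

Adding further input variables to the key (an obstacle flag, for instance) would enlarge the input space and select a different subgoal string, which is the effect the next paragraph describes.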

Additional variables added to F from an external source increase the dimensionality of the input space and can thus alter the output string (task decomposition) in response to environmental conditions. The different possible feedback pathways to a CMAC control module cast light on a long-standing controversy in neurophysiology regarding whether behavior patterns are generated by "stimulus-response chaining" (ie: a sequence of actions in which feedback from sensory organs is required to step from one action to the next) or by "central patterning" (ie: a sequence which is generated by internal means alone).

A CMAC hierarchy may include tight feedback loops from the output of one level back to its own input to generate central patterns, longer internal loops from one level to another to cycle through a sequence of central patterns, as well as feedback from the environment to select or modify central patterns or their sequence in accordance with environmental conditions. The capability of CMAC to simulate a finite state automaton, to execute the equivalent of a conditional branch, and to compute a broad class of multivariant functions makes it possible to construct the CMAC equivalent of a computer program.

Conversely it is possible to construct a hierarchy of computing modules, perhaps implemented on a network of microprocessors, which is the equivalent of a CMAC hierarchy. This has profound implications regarding the type of computing architecture which might be used to build a model of the brain for robot control. Each CMAC is a state machine which samples or polls a set of input variables and computes a set of output variables. There is no way that it can be instructed to DO something N times.

But this is not a DO loop in the customary sense. Similarly, one or more of the CMAC input variables can be used to "interrupt" an ongoing trajectory by causing a branch to a new trajectory. A hierarchy of CMACs can return to the interrupted trajectory after a deviation, if the higher level goals remain unchanged throughout the lower level trajectory deviation. This, however, is quite a different mechanism from the interrupt circuitry in the normal computer where a program counter is stored so that program execution can continue after the interrupt has been serviced.
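
The difference can be sketched in the same table-driven style as above (the GO command, the sensed OBSTACLE condition, and the state names are again invented for illustration): every cycle the module simply polls its inputs and recomputes its output, so a changed input variable branches it onto a new trajectory, and it finds its way back only because the higher level command is unchanged, not because any return address was saved.

    # Each cycle the output is recomputed purely from the current inputs
    # (command, sensed condition, fed-back state).  No program counter is stored.
    TABLE = {
        ("GO", "CLEAR",    "A"): "B",
        ("GO", "CLEAR",    "B"): "C",
        ("GO", "CLEAR",    "C"): "A",
        ("GO", "OBSTACLE", "A"): "AVOID",      # the "interrupt": branch to a new trajectory
        ("GO", "OBSTACLE", "B"): "AVOID",
        ("GO", "OBSTACLE", "C"): "AVOID",
        ("GO", "OBSTACLE", "AVOID"): "AVOID",
        ("GO", "CLEAR",    "AVOID"): "A",      # resume, because the goal "GO" never changed
    }

    state = "A"
    for sensed in ["CLEAR", "CLEAR", "OBSTACLE", "OBSTACLE", "CLEAR", "CLEAR"]:
        state = TABLE[("GO", sensed, state)]
        print(sensed, "->", state)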

The implication here is that a set of robot control programs modeled after a CMAC hierarchy will include no DO loops and will not be interrupt driven. Note also that in a CMAC hierarchy, a deviation in a higher level trajectory changes the command string, and hence the program, of all the levels below it.

This implies real time modification of program statements and thus makes the use of a compiler based programming language somewhat cumbersome. A robot control system modeled after a CMAC hierarchy should use some form of an interpretive language where program statements are translated into machine code at execution time.
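
A toy interpreter illustrates why (the statement format and the speed variable are invented for illustration): because each statement is only translated when it is executed, a higher level module can rewrite the statement list between cycles and the change takes effect immediately, something a compiled program cannot easily allow.

    # Toy interpreter: statements are data, looked up and executed one cycle at
    # a time, so they can be modified while the "program" is running.
    def execute(stmt, env):
        verb, arg = stmt
        if verb == "SET":
            name, value = arg
            env[name] = value
        elif verb == "MOVE":
            print("moving at speed", env.get(arg, 0))

    program = [("SET", ("speed", 1)), ("MOVE", "speed")]
    env = {}
    for cycle in range(3):
        for stmt in program:
            execute(stmt, env)
        # a higher level edits a program statement in real time between cycles
        program[0] = ("SET", ("speed", env["speed"] + 1))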


An interpretive language can, of course, be written in a compiler based language. Also, languages can be devised which are partially compiled and partially interpreted.


We will return to these and other practical issues of computing architecture for robot control at a later time. As was discussed in part 1, any spatial pattern can be represented as a vector. For example, a picture can be represented as an array, or ordered list, of brightness or color values.


A symbolic character can be represented as an ordered list of features or arbitrary numbers, as in the ASCII convention. Any temporal pattern can be represented as a trajectory through an N-dimensional space. For example, an audio pattern is a sequence of pressure or voltage values (ie: a one-dimensional trajectory). A moving picture or television scene corresponds to a sequence of picture vectors (ie: an N-dimensional trajectory where N is the number of picture resolution elements or pixels). The fundamental problem of pattern recognition is to name the patterns.
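
In modern code these representations are nothing more than lists and lists of lists; the sizes and values below are arbitrary and chosen only for illustration.

    # A spatial pattern is a vector: a 4 x 4 gray scale picture flattened into
    # an ordered list of 16 brightness values (one point in a 16 dimensional space).
    picture = [ 12, 40, 200, 33,
                90, 17, 255, 60,
                 5, 80, 140, 22,
                70, 11,  99, 48 ]

    # A temporal pattern is a trajectory: an audio pattern is a one dimensional
    # trajectory, one pressure or voltage value per sample...
    audio = [0.0, 0.3, 0.7, 0.4, -0.2, -0.6]

    # ...and a moving picture is an N dimensional trajectory, one picture vector
    # per frame (here N = 16).
    movie = [picture, [v // 2 for v in picture], [v // 4 for v in picture]]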

All the patterns with the same name are in the same class. When a pattern has been given a name we say it has been recognized.

For example, when the image of a familiar face falls on my retina and I say to myself "That's George," I have recognized the visual pattern by naming it. At this point we need to introduce some new notation to clearly distinguish between vectors in the sensory processing hierarchy and those in the behavior-generating hierarchy. We can now define a CMAC D vector to represent a sensory pattern plus context such that each component di represents a data point or feature of the pattern plus context.

The existence of the D vector within a particular region of space therefore corresponds to the occurrence of a particular set of features or a particular pattern in a particular context. In other words G can recognize the existence of a particular pattern and context (ie: the existence of D in a particular region of input space) by outputting the name Q.

This means that, as long as the regions of input space corresponding to pattern classes are reasonably well separated, the G function can reliably distinguish one region of input space from another and hence classify the corresponding sensory patterns correctly. In the case where the D vector is time dependent, an extended portion of a trajectory TD may map into a single name Q as shown in figure 21. It is then possible, by integrating Q over time and thresholding the integral, to detect, or recognize, a temporal pattern TD such as a sound or a visual movement.
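
A minimal sketch of both steps, with the prototype vectors, names, and threshold invented for illustration: the G function is approximated here by a nearest-prototype rule over the input space, and the temporal pattern is detected by integrating the resulting name over time and thresholding the integral.

    import math

    # Hypothetical pattern classes: each name Q owns a region of input space,
    # represented here by a single prototype vector.
    PROTOTYPES = {"rising": [0.0, 0.5, 1.0], "falling": [1.0, 0.5, 0.0]}

    def G(d):
        # Name the region the D vector falls in (nearest prototype stands in
        # for the trained G function).
        return min(PROTOTYPES, key=lambda q: math.dist(d, PROTOTYPES[q]))

    # An extended portion of the trajectory TD maps into the same name, so
    # integrating that name over time and thresholding detects the pattern.
    TD = [[0.1, 0.6, 0.9], [0.0, 0.4, 1.0], [0.2, 0.5, 0.8], [0.9, 0.5, 0.1]]
    integral = sum(1 for d in TD if G(d) == "rising")
    if integral >= 3:                   # threshold on the integrated name
        print("temporal pattern 'rising' recognized")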


Note that the recognition, or naming, of a temporal pattern as illustrated in figure 21 is the inverse of the decomposition of a task as illustrated in figures 14 thru 17 in the previous article in this series. In task decomposition a slowly varying command C is decomposed into a rapidly changing output P. In pattern recognition a rapidly changing sensory experience E is recognized by a slowly varying name Q. It frequently occurs in pattern recognition or signal detection that the instantaneous value of the sensory input vector E is ambiguous or misleading.

This is particularly true in noisy environments or in situations where data dropouts are likely to occur. In such cases the ambiguity can often be resolved or the missing data filled in if the context can be taken into account, or if the classification decision can make use of some additional knowledge or well founded prediction regarding what patterns are expected. The context variables thus can shift the total input pattern vector D to different parts of input space depending on the context.

Thus, as shown in figure 22, the ambiguous patterns E1 and E2, which are too similar to be reliably recognized as being in separate classes, can easily be distinguished when accompanied by context R1 and R2. In the brain, many variables can serve as context variables. In fact, any fiber carrying information about anything occurring simultaneously with the input pattern can be regarded as context.
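
The effect is easy to see numerically (the vectors below are made up for illustration): two E vectors too close to separate become well separated once different context vectors R1 and R2 are appended to form the full D vectors.

    import math

    E1 = [0.50, 0.51]            # two ambiguous sensory patterns,
    E2 = [0.51, 0.50]            # nearly indistinguishable on their own
    R1 = [1.0, 0.0]              # context accompanying E1
    R2 = [0.0, 1.0]              # context accompanying E2

    D1 = E1 + R1                 # total input vector: pattern plus context
    D2 = E2 + R2

    print(math.dist(E1, E2))     # about 0.014 -> same region of input space
    print(math.dist(D1, D2))     # about 1.414 -> easily separable regions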

Thus context can be data from other sensory modalities as well as information regarding what is happening in the behavior-generating hierarchy. In many cases, data from this latter source is particularly relevant to the pattern recognition task, because the sensory input at any instant of time depends heavily upon what action is currently being executed. For example, information from the behavior-generating hierarchy provides contextual information necessary for the visual processing hierarchy to distinguish between motion of the eyes and motion of the room about the eyes.

In a classic experiment, von Holst and Mittelstaedt demonstrated that this kind of contextual data pathway actually exists in insects. They observed that a fly placed in a chamber with rotating walls will tend to turn in the direction of rotation so as to null the visual motion.

When the fly's head was rotated 180 degrees so that its visual field was reversed, the fly turned endlessly in circles: by attempting to null the visual motion it was now actually increasing it. Later experiments with motion perception in humans showed that the perception of a stationary environment despite motion of the retinal image caused by moving the eyes is dependent on contextual information derived from the behavior-generating hierarchy. The fact that the context is actually derived from the behavior-generating hierarchy rather than from sensory feedback can be demonstrated by anesthetizing the eye muscles and observing that the effect depends on the intent to move the eyes, and not the physical act of movement.

The perceptual correction occurs even when the eye muscles are paralyzed so that no motion actually results from the conscious intent to move. Contextual information can also provide predictions of what sensory data to expect. This allows the sensory processing modules to do predictive filtering, to compare incoming data with predicted data, and to "flywheel" through noisy data or data dropouts.

The mechanism by which such predictions, or expectations, can be generated is illustrated in figure 23. Here contextual input for the sensory processing hierarchy is shown as being processed through a CMAC M module before being presented to the sensory pattern recognition G modules at each level. Inputs to the M modules derive from the P vector of the corresponding behavior-generating hierarchy at the same level, as well as an X vector which includes context derived from other areas of the brain, such as other sensory modalities or other behavior-generating hierarchies.

Their position in the links from the behavior-generating to the sensory processing hierarchies allows them to function as a predictive memory. They are in a position to store and recall, or remember, sensory experiences (E vector trajectories) which occur simultaneously with P and X vector trajectories in the behavior-generating hierarchy and other locations within the brain.

For example, data may be stored in each Mi module by setting the desired output Ri equal to the sensory experience vector Ei. These predictive memory modules thus provide the sensory processing hierarchy with a memory trace of what sensory data occurred on previous occasions when the motor generating hierarchy and other parts of the brain were in similar states along similar trajectories.
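
A crude sketch of such a predictive memory module, with the quantization scheme and the example vectors invented for illustration: while an action is performed the module stores the sensory vector E under the current (P, X) state, and on a later pass through a similar state it recalls that E as the expectation R.

    class PredictiveMemory:
        """Toy M module: sensory vectors stored and recalled by (P, X) context."""
        def __init__(self):
            self.table = {}

        def _key(self, p, x):
            # Coarse rounding stands in for CMAC's overlapping receptive fields,
            # so nearby states share an entry and the recall generalizes.
            return tuple(round(v, 1) for v in p + x)

        def store(self, p, x, e):
            self.table[self._key(p, x)] = e          # train: desired output R = observed E

        def recall(self, p, x):
            return self.table.get(self._key(p, x))   # prediction R, or None if novel

    M = PredictiveMemory()
    M.store(p=[0.2, 0.7], x=[1.0], e=[5.0, 5.1, 4.9])   # experience during an action
    print(M.recall(p=[0.21, 0.68], x=[1.0]))            # similar state -> expected E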


This provides the sensory processing system with a prediction of what sensory data to expect. What is expected is whatever was experienced during similar activities in the past. In the ideal case, the predictive memory modules Mi will generate an expected sensory data stream TRi which exactly duplicates the observed sensory data stream TEi. To the extent that this occurs in practice it enables the Gi modules to apply very powerful mathematical techniques to the sensory data.

For example, the Gi modules can use the expected data TRi to:

• Perform cross-correlation or convolution algorithms to detect sync patterns and information bearing sequences buried in noise.
• Flywheel through data dropouts and noise bursts.
• Detect or recognize deviations from, or even omissions of, an expected pattern, as well as the occurrence of the pattern in its expected form (the last two uses are sketched below).

If we assume, as shown in figure 23, that predictive recall modules exist at all levels of the processing-generating hierarchy, then it is clear that the memory trace itself is multileveled. We can say that the predictive memory modules Mi define the brain's internal model of the external world. They provide answers to the question, "If I do this and that, what will happen?"
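
A sketch of the last two uses from the list above, with the data stream and the deviation threshold invented for illustration: the G module compares the observed stream TE against the expected stream TR, substitutes the expectation wherever the data drop out, and flags any sample that deviates too far from what was predicted.

    TR = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5]      # expected (recalled) data stream
    TE = [1.0, 1.1, None, 1.3, 2.6, 1.5]     # observed stream: a dropout and a surprise

    cleaned, deviations = [], []
    for t, (r, e) in enumerate(zip(TR, TE)):
        if e is None:
            cleaned.append(r)                # "flywheel" through the dropout
        else:
            cleaned.append(e)
            if abs(e - r) > 0.5:             # deviation from the expected pattern
                deviations.append(t)

    print(cleaned)      # [1.0, 1.1, 1.2, 1.3, 2.6, 1.5]
    print(deviations)   # [4]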

In short, IF I do Y, THEN Z will happen, where Z is whatever was stored in predictive memory the last time (or some statistical average over the N last times) that I did Y, and Y is some action, such as performing a task or pursuing a goal in a particular environment or situation, which is represented internally by the P vectors at the various levels of the behavior-generating hierarchy and the X vectors describing the states of various other sensory processing and behavior-generating hierarchies.
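
Written as a conventional program, each such piece of predictive knowledge is an ordinary production rule; the actions and outcomes below are invented for illustration.

    # Each production rule pairs an (action, situation) condition with the
    # consequence recalled from predictive memory: IF I do Y here, THEN Z.
    RULES = {
        ("push door", "door unlocked"): "door opens",
        ("push door", "door locked"):   "door stays shut",
    }

    def predict(action, situation):
        return RULES.get((action, situation), "unknown")

    print(predict("push door", "door locked"))   # -> door stays shut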

Practically any kind of knowledge, or set of beliefs, or rules of behavior can be represented as a set of production rules. The CMAC hierarchy shown in figure 23 illustrates how such computational mechanisms can arise in the neurological structure of the brain. We have now completed the second step in our development. I have described a neurological model which can store and recall and hence compute a broad class of mathematical functions.