
Bacteria for Complex Problem Solving

Facultad de Informática de la Universidad Politécnica de Madrid
“New system using bacterial communities to solve complex problems”
(Spanish) June 16, 2010
(English précis) ScienceDaily, August 20, 2010

Found at
www.sciencedaily.com/releases/2010/06/100601072638.htm

Description
Development of methodologies for use in both bacterial computing and synthetic biology. Algorithms are designed around bacterial conjugation (genetic information transfer) and quorum sensing (gene expression based on population density).
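The article does not give the algorithms themselves, but the two primitives are easy to illustrate. Below is a minimal Python sketch, with every name and parameter invented for illustration: agents pass a ‘plasmid’ flag between random pairs (conjugation), and the population switches behavior once the fraction of carriers crosses a threshold (quorum sensing).

    import random

    POPULATION = 200   # number of simulated cells (illustrative choice)
    QUORUM = 0.5       # carrier fraction that triggers collective behavior
    STEPS = 5000

    # Each cell either carries the 'plasmid' (True) or does not (False).
    cells = [False] * POPULATION
    cells[0] = True    # seed a single carrier

    for step in range(STEPS):
        # Conjugation: a random pair meets and a carrier copies its plasmid over.
        a, b = random.sample(range(POPULATION), 2)
        if cells[a] != cells[b]:
            cells[a] = cells[b] = True

        # Quorum sensing: collective 'gene expression' switches on at density.
        density = sum(cells) / POPULATION
        if density >= QUORUM:
            print(f"quorum reached at step {step} (density {density:.2f})")
            break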

Quorum Sensing in Honeybee Swarms

Thomas D. Seeley and P. Kirk Visscher
“Quorum sensing during nest-site selection by honeybee swarms”
Behavioral Ecology and Sociobiology 2004, 56:594–601

Found at
www.culturaapicola.com.ar/apuntes/conducta/70_quorum.pdf

Description
An early test of the quorum-sensing hypothesis in a social insect. Bees have a complex and well-studied signaling system, based on body movement, that enables collective behavior such as choosing a new nest site.

Analog Computer

A computer is a device that performs calculations on input data to produce output. The digital computer is the type in common use today, but there is a much older type: the analog computer. Mechanical analog computers have been used since ancient times; the slide rule (invented in the 1600s) is one example. Electronic analog computers are still in fairly wide use today, though mainly for specific tasks (e.g., control systems).
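The slide rule makes the principle concrete: it multiplies by adding physical lengths proportional to logarithms, using log(a) + log(b) = log(ab). A quick Python sketch of that one idea:

    import math

    def slide_rule_multiply(a, b):
        """Multiply by adding logarithmic 'lengths', as a slide rule does."""
        return math.exp(math.log(a) + math.log(b))

    print(slide_rule_multiply(2.0, 3.0))   # ~6.0, to the precision of the 'device'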

In an analog computer, processing is done using continuous quantities (real numbers) rather than discrete digital values. A real number can represent any value up to the precision allowed by the physical device’s tolerances, so an analog computer can work on calculus problems directly. Where a digital computer needs many transistors (each holding a 1 or 0) to store one discrete value, an analog computer can use a single capacitor to store one continuous value. Analog computers can provide good models of the physical world: the mathematics governing masses, springs, fluid flow, and so on can be mapped directly onto an electronic circuit built from operational amplifiers. All processing also happens in parallel, as opposed to the sequential nature of digital computers, so output tracks input in near real time, which makes analog computers useful in many control systems.
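To make the op-amp point concrete, here is a hedged sketch that steps, in discrete time, the classic analog patch for a mass on a spring: two integrators in a loop solving x'' = -(k/m)x. On a real analog computer the integrators are capacitors and the loop runs continuously; the step size and constants below are arbitrary illustrative choices.

    # Analog-computer 'patch' for a mass-spring system, stepped digitally:
    # acceleration -> integrator -> velocity -> integrator -> position,
    # with position fed back (inverted and scaled by k/m) as acceleration.
    k_over_m = 4.0          # spring constant over mass (arbitrary)
    dt = 0.001              # small step standing in for continuous integration
    x, v = 1.0, 0.0         # initial position and velocity

    t = 0.0
    while t < 3.14159:      # one full period, since omega = sqrt(k/m) = 2
        a = -k_over_m * x   # the feedback path (an inverting amplifier)
        v += a * dt         # first integrator (a capacitor in the real circuit)
        x += v * dt         # second integrator
        t += dt

    print(f"x = {x:.3f}")   # analytically x(t) = cos(2t), so x returns to ~1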

During World War II, analog computers grew in complexity and power as controllers for weapons systems. However, the advent of the modern electronic digital computer after the war largely made them obsolete. Digital computers offer great advantages: miniaturization allows millions of simple binary transistors to be placed on a single chip; digital numbers can separate mantissa and exponent (scientific notation), giving virtually unlimited dynamic range; and digital values are largely immune to noise and signal loss.
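The mantissa/exponent split is visible in any modern floating-point format; Python’s standard math.frexp exposes it directly (the values below are arbitrary):

    import math

    for value in (0.000003, 3.0, 3.0e12):
        mantissa, exponent = math.frexp(value)   # value == mantissa * 2**exponent
        print(f"{value:>12g} = {mantissa:.6f} * 2**{exponent}")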

Double-Blind Study

One of the greatest steps forward in science in the last several decades has been the widespread acceptance and use of the double-blind study. This has more to do with human nature than with the scientific study at hand, but science is done by people, so the statement holds. Direct experience of an event may not be as reliable as reasoning about evidence (scientific inference), even after the fact. This is a somewhat surprising notion, and one that has truly profound implications, not just for science, but for any field of testable knowledge.

A double-blind study is so called because neither the subjects nor the researchers administering the test know its key details. For example, when testing a new drug, neither the patients nor the researchers recording the results know which patients receive the drug being tested and which receive a placebo. Such a study eliminates conscious and unconscious bias on the part of both subjects and researchers.
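As a minimal sketch of the bookkeeping behind such a trial (a simple two-arm design with invented names, not any specific protocol): a coordinator who never meets the patients randomizes assignments behind opaque codes, and the code-to-arm key is opened only after all outcomes are recorded.

    import random

    def blind_assignments(n_subjects, seed=None):
        """Randomly assign subjects to 'drug' or 'placebo' behind opaque codes."""
        rng = random.Random(seed)
        arms = ['drug', 'placebo'] * (n_subjects // 2)
        rng.shuffle(arms)
        # The key mapping code -> arm stays sealed until the trial is unblinded.
        key = {f"subject-{i:03d}": arm for i, arm in enumerate(arms)}
        return list(key), key

    codes, key = blind_assignments(10, seed=42)
    print(codes)    # all that blinded researchers and subjects ever see
    # print(key)    # opened only after every outcome has been recorded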

Another example is the analysis of results from a particle physics experiment. Single-blind is sufficient here, since the subjects (particles) are not capable of bias. One methodology is to subdivide the analysis into fractions whose sizes are unknown to the analysts, who then have no expectation of particular results for any fraction. Once all the fractions have been analyzed in this blind fashion, they can be assembled systematically, and without bias, into a complete study. Several other methodologies are used, each carefully withholding enough information from the analysts to neutralize their biases (knowledge of current theory, of the apparatus used, of colleagues’ results, and so on).
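A sketch of the hidden-fraction idea under the same caveat (invented names and numbers): the data are shuffled and cut into pieces whose sizes the analysts never learn, each piece is analyzed independently, and only afterwards are the partial results combined.

    import random

    def split_into_hidden_fractions(data, n_fractions, seed=None):
        """Shuffle and cut data into randomly sized pieces; sizes stay hidden."""
        rng = random.Random(seed)
        shuffled = data[:]
        rng.shuffle(shuffled)
        cuts = sorted(rng.sample(range(1, len(shuffled)), n_fractions - 1))
        return [shuffled[i:j] for i, j in zip([0] + cuts, cuts + [len(shuffled)])]

    measurements = [random.gauss(5.0, 1.0) for _ in range(1000)]
    fractions = split_into_hidden_fractions(measurements, 4, seed=7)
    partials = [(sum(f), len(f)) for f in fractions]    # each analyzed blind
    total, count = map(sum, zip(*partials))             # combined only at the end
    print(f"combined mean: {total / count:.3f}")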

This observer bias can have many causes, not the least of which is the pattern-seeking nature of the human mind. Our experience, expectations, and even desires can alter and augment our perceptions. Beyond countering observer bias, a well-designed double-blind study can also minimize statistical illusions and spurious cause-and-effect conclusions.

Genome Sequence Storage

Lincoln D. Stein (Ontario Institute for Cancer Research)
“The case for cloud computing in genome informatics”
Genome Biology 2010, 11:207 doi:10.1186/gb-2010-11-5-207

Found at
genomebiology.com/2010/11/5/207

Description
Argument for cloud computing as a new storage paradigm now that DNA sequencing has become inexpensive and pervasive.