

Algorithms - what is that ...
Monday, 25 July 2005
Turboslav



While thinking about the necessary (but not sufficient) resources for autonomy, the notion of the "algorithm" imposes itself as a mechanism of concealment, estrangement, translation (of the sense of motion; also of a shift of meaning - yes, even of the discursive field). Mathematically speaking, it is a transfer function (more or less complex) that turns an otherwise simple motion into something "incomprehensible". There are, of course, many degrees of incomprehensibility; we are interested in the simplest ones, those that still make the result of the transformation "magical".

A simple example: a juggler tossing two balls is not doing anything special; the fascination and the magic begin with three balls - the observer's grasp of how the balls move from hand to hand slips away. Hence: incomprehension = fascination & magic ...
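To make the "transfer function" image concrete, here is a minimal Python sketch (my own illustration, not from the article): each step in the chain is trivial, yet after only a few of them the relation between the simple input "motion" and the output is no longer obvious to the observer.

```python
# A minimal sketch (illustrative only): a "transfer function" built from a
# few trivial steps whose composed output is hard to trace back to the
# simple input -- the article's sense of "magical".

def transfer(x: float, steps: int = 3) -> float:
    """Apply a short chain of elementary operations to a value in [0, 1]."""
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)   # each individual step is simple
    return x

# A perfectly simple "motion": a slow ramp from 0 to 1.
simple_motion = [i / 10 for i in range(11)]

# After only three trivial steps the output no longer looks like a ramp.
for x in simple_motion:
    print(f"{x:.1f} -> {transfer(x):.3f}")
```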



But to begin with...

From Wikipedia, the free encyclopedia.

In mathematics and computer science, an algorithm (the word is derived from the name of the Persian mathematician Al-Khwarizmi) is a finite set of well-defined instructions for accomplishing some task which, given an initial state, will terminate in a corresponding recognizable end-state (contrast with heuristic). The concept of an algorithm is often illustrated by the example of a recipe, although many algorithms are much more complex; algorithms often have steps that repeat (iterate) or require decisions (such as logic or comparison) until the task is completed.
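As a concrete illustration of that definition (my own example, not part of the quoted entry), Euclid's algorithm has all the ingredients named above: a well-defined initial state, a repeated step, a decision, and a recognizable end state.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, well-defined procedure.

    Initial state: two non-negative integers a and b.
    Repeated step: replace (a, b) with (b, a mod b).
    End state:     b == 0; a then holds the greatest common divisor.
    """
    while b != 0:          # decision: have we reached the end state?
        a, b = b, a % b    # iteration: one well-defined step
    return a

print(gcd(252, 105))  # -> 21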
Among different paradigms of algorithms we choose this one:

The probabilistic and heuristic paradigm. Algorithms belonging to this class fit the definition of an algorithm more loosely.

  1. Probabilistic algorithms are those that make some choices randomly (or pseudo-randomly); for some problems, it can in fact be proved that the fastest solutions must involve some randomness.
  2. Genetic algorithms attempt to find solutions to problems by mimicking biological evolutionary processes, with a cycle of random mutations yielding successive generations of 'solutions'. Thus, they emulate reproduction and "survival of the fittest". In genetic programming, this approach is extended to algorithms, by regarding the algorithm itself as a 'solution' to a problem. Also there are
  3. heuristic algorithms, whose general purpose is not to find an optimal solution, but an approximate solution where the time or resources needed to find a perfect solution are not practical. An example of this would be local search, tabu search, or simulated annealing algorithms, a class of heuristic probabilistic algorithms that vary the solution of a problem by a random amount. The name 'simulated annealing' alludes to the metallurgic term meaning the heating and cooling of metal to achieve freedom from defects. The purpose of the random variance is to find close to globally optimal solutions rather than simply locally optimal ones, the idea being that the random element will be decreased as the algorithm settles down to a solution. (A minimal simulated annealing sketch follows this list.)
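Here is that simulated annealing idea in a few lines of Python (my own illustration; the objective function and the cooling schedule are arbitrary choices, not from the quoted text): the solution is varied by a random amount, and the random element shrinks as the algorithm settles down.

```python
import math
import random

def simulated_annealing(f, x, temp=1.0, cooling=0.95, steps=500):
    """Minimize f by randomly perturbing x; the perturbation shrinks over time."""
    best = x
    for _ in range(steps):
        candidate = x + random.uniform(-1.0, 1.0) * temp   # random variation
        # Always accept better candidates; accept worse ones with a
        # probability that falls as the "temperature" drops.
        if f(candidate) < f(x) or random.random() < math.exp((f(x) - f(candidate)) / temp):
            x = candidate
        if f(x) < f(best):
            best = x
        temp *= cooling                                     # cooling the metal
    return best

# Example: a bumpy function with many local minima; the random element
# helps the search escape them and settle near the global minimum.
def bumpy(x):
    return x * x + 3.0 * math.sin(5.0 * x)

print(simulated_annealing(bumpy, x=4.0))
```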

Another way to classify algorithms is by implementation. A recursive algorithm is one that invokes (makes reference to) itself repeatedly until a certain condition matches, which is a method common to functional programming. Algorithms are usually discussed with the assumption that computers execute one instruction of an algorithm at a time. Those computers are sometimes called serial computers. An algorithm designed for such an environment is called a serial algorithm, as opposed to parallel algorithms, which take advantage of computer architectures where several processors can work on a problem at the same time. The various heuristic algorithms would probably also fall into this category, as their names (e.g. genetic algorithm) describe their implementation.
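To illustrate the recursive implementation style mentioned above (again my own sketch, not from the quoted text), here is Euclid's gcd written so that the function invokes itself until the terminating condition matches.

```python
def gcd_recursive(a: int, b: int) -> int:
    """Recursive form of Euclid's algorithm: the function calls itself
    until the terminating condition (b == 0) matches."""
    if b == 0:
        return a
    return gcd_recursive(b, a % b)

print(gcd_recursive(252, 105))  # -> 21, same result as the iterative version
```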

A subtype of parallel algorithms, distributed algorithms are algorithms designed to work in cluster computing and distributed computing environments where additional concerns over "classical" parallel algorithms need to be addressed.

For our use we will focus on heuristic/parallel algorithms and stay away from computers as much as possible... We will focus on electro-mechanical automata - so-called mechatronics - or even purely mechanical systems. We'll search among BEAM robotics. The pioneer in this field was William Grey Walter.

From Wikipedia, the free encyclopedia...
Grey Walter's most famous work was his construction of some of the first electronic autonomous robots. He wanted to prove that rich connections between a small number of brain cells could give rise to very complex behaviors - essentially that the secret of how the brain worked lay in how it was wired up. His first robots, named Elmer and Elsie, were constructed between 1948 and 1949 and were often described as tortoises due to their shape and slow rate of movement - and because they 'taught us' about the secrets of organisation and life.

BEAM robotics (acronym for Biology, Electronics, Aesthetics, and Mechanics) is a style of robotics that uses simple analog circuits instead of a microprocessor. Most BEAM robots are unusually simple in design compared to traditional mobile robots, and trade off flexibility in purpose for robustness of performance.

Unlike many other types of robots controlled by microcontrollers, BEAM robots are built on the principle of using multiple simple behaviors linked directly to sensor systems with little signal conditioning. This design philosophy is closely echoed in the classic book "Vehicles: Experiments in Synthetic Psychology", which through a series of thought experiments explores the development of complex robot behaviors through simple inhibitory/excitatory sensor links to the actuators.
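Those "simple inhibitory/excitatory sensor links to the actuators" can be sketched in a few lines (my own Braitenberg-style illustration, not code from the book or from BEAM practice): each wheel speed is driven directly by a light sensor, with no controller in between.

```python
# A minimal Braitenberg-style sketch (illustrative assumption): two light
# sensors wired straight to two wheel motors. Crossed excitatory wiring
# makes the vehicle turn toward the light.

def wheel_speeds(left_light: float, right_light: float) -> tuple[float, float]:
    """Each sensor reading (0..1) directly sets the opposite wheel's speed."""
    left_wheel = right_light    # excitatory, crossed connection
    right_wheel = left_light
    return left_wheel, right_wheel

# Light is brighter on the right: the left wheel spins faster,
# so the vehicle turns right, toward the light.
print(wheel_speeds(left_light=0.2, right_light=0.9))  # -> (0.9, 0.2)
```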


