Re: Desteni
Posted by: Stoic ()
Date: November 12, 2010 07:37PM

Regarding the 'long con'--of which Desteni is undoubtedly an example--I found this interesting piece (from a marketing blog) that uses the thought processes of a skilled poker player making one fast decision as a useful analogy for the kind of thinking conmen use to perpetrate their con games.
[longcon.blogspot.com]

It is useful to remember that the better poker players, like the more skilled con-artists, think of nothing else but their chosen game. Their entire world is built on the general principles of poker--that is essentially what makes them good at what they do. Confidence comes from doing: endless practice until one is competent, and then more practice to forget the principles because they are, by now, ingrained. At that point one can play good poker without consciously thinking about it.



'More thought in one hand of poker than most “offsite marketing strategy breakaway” sessions. What is amazing about the thought process is not so much the memory and the information storage but the synthesis of information on the fly and the resulting decision making.'


and:

'If you didn’t read Daniel Negreanu’s thought process you would have put his play down to prescience or intuition. He really just had a very good read on the hand and the ability to make a decision based on the information he had, a simple but extremely rare and valuable skill.'

Incidentally, this is why it is pointless to seek confidence the Tony Robbins way, in fire-walking once or twice--all you learn is a party-piece to dazzle some onlookers. One could endlessly practice fire-walking and become an expert--but there is not much call for a specialist fire-walker in today's economy--Tony has hogged the only open spot.

I've used the examples of poker, con-games and marketing because those are the most familiar and obvious to me as good illustrations of the point; I am not endorsing any of those practices.
In my opinion, if you are going to specialise at all, specialise in clear and effective thinking; it is by far the most transferable skill.



Edited 1 time(s). Last edit at 11/12/2010 07:38PM by Stoic.

Re: Desteni
Posted by: Sandman ()
Date: November 12, 2010 08:32PM

Quote
Stoic
I've used the examples of poker, con-games and marketing because those are the most familiar and obvious to me as good illustrations of the point; I am not endorsing any of those practices.

Bernard Poolman endorses online poker as a 'social and economic tool':

Process Support - Mixing Work & Pleasure

Re: Desteni
Posted by: Stoic ()
Date: November 12, 2010 09:44PM

'Bernard Poolman endorses online poker as a "social and economic tool"'


That's useful to know; he's familiar with bluffing his way through life then, as if we hadn't already guessed that! ;)

He does have big ideas though, if he dreams of an army of Desteni poker bluffers taking over the world. It takes an inborn aptitude married to long practice and total dedication to produce a good player. Someone who has 'stopped their mind, feelings and emotions' Desteni-style will not fare well in the viciously competitive poker snake-pits.

Re: Desteni
Posted by: The Anticult ()
Date: November 13, 2010 07:25AM

At least the Desteni followers know where their money will go:
lost in online poker games, thanks to the Dear Leader of Desteni's online gambling addiction.


Those with manic delusions of grandeur often have gambling addictions in which they lose everything, time and time again.

Re: Desteni
Posted by: SimulacronX ()
Date: November 19, 2010 09:07PM

Just a little thing: a book by longtime Desteni member "SpamAnn" has been published.

I, Human

One more proof of how deeply Desteni affects its shaven members.
Mostly young folks with little life experience.

Re: Desteni
Posted by: Cresil1HumanOST ()
Date: November 20, 2010 02:48AM

About the book,

I'd be really surprised if Desteni didn't claim copyright over it, if it presents themes or elements similar to Desteni's material--material that for the most part Desteni actually ripped off from others, and so should legally be unable to hold a legitimate copyright on in the first place.


Desteni = Repressive Collective Consciousness Group or Groups.
R.C.C.G

- Cresil



Edited 2 time(s). Last edit at 11/20/2010 02:55AM by Cresil1HumanOST.

Re: Desteni
Posted by: corboy ()
Date: November 20, 2010 03:09AM

Hmm, Isaac Asimov wrote a science fiction novel entitled I, Robot.

[www.google.com]

Re: Desteni
Posted by: corboy ()
Date: November 20, 2010 03:15AM

Correction: I, Robot is a collection of short stories written by Asimov. Still, read about it and maybe there will be some clues.

From Wikipedia

Quote

Most of Asimov's robot short stories are set in the first age of positronic robotics and space exploration. The unique feature of Asimov's robots are the Three Laws of Robotics [en.wikipedia.org/wiki/Three_Laws_of_Robotics], hardwired in a robot's positronic brain [en.wikipedia.org/wiki/Positronic_brain], which all robots in his fiction must obey, and which ensure that the robot does not turn against its creators.

The stories were not initially conceived as a set, but rather all feature his positronic robots — indeed, there are some inconsistencies among them, especially between the short stories and the novels. They all, however, share a theme of the interaction of humans, robots, and morality. Some of the short stories found in The Complete Robot and other anthologies appear not to be set in the same universe as the Foundation Universe. "Victory Unintentional" has positronic robots obeying the Three Laws, but also a non-human civilization on Jupiter. "Let's Get Together" features humanoid robots, but from a different future (where the Cold War is still in progress), and with no mention of the Three Laws.

Robot novels
The first four robot novels make up the Elijah Baley (sometimes "Lije Baley") series, and are mysteries starring the Terran Elijah Baley and his humaniform robot partner, R. Daneel Olivaw. They are set thousands of years after the short stories, and focus on the conflicts between Spacers — descendants of human settlers from other planets, and the people from an overcrowded Earth. "Mirror Image", one of the short stories from The Complete Robot anthology, is also set in this time period (between The Naked Sun and The Robots of Dawn), and features both Baley and Olivaw. Another short story (found in The Early Asimov anthology), "Mother Earth", is set about a thousand years before the robot novels, when the Spacer worlds chose to become separated from Earth.

Because many of the Robot novels were written prior to 1962, they were not eligible for science fiction awards, such as the Hugo, which only arrived on the scene after that year. Two of the later novels, however, were written late enough to receive such accolades. Robots of Dawn was nominated for both the Hugo and Locus Awards in 1984,[1] and Robots and Empire was shortlisted for the Locus Award for Best Science Fiction Novel in 1986.[2]

Inspiration
One source of inspiration for Asimov's robots was the Zoromes, a race of mechanical men that featured in a 1931 short story called "The Jameson Satellite", by Neil R. Jones. Asimov read this story at the age of 11, and acknowledged it as a source of inspiration in Before the Golden Age (1975), an anthology of 1930s science fiction in which Asimov told the story of the science fiction he read during his formative years. In Asimov's own words:

It is from the Zoromes, beginning with their first appearance in "The Jameson Satellite," that I got my own feeling for benevolent robots who could serve man with decency, as these had served Professor Jameson. It was the Zoromes, then, who were the spiritual ancestors of my own "positronic robots," all of them, from Robbie to R. Daneel.[3]

Merging with other series
Asimov later integrated the Robot Series into his all-engulfing Foundation series, making R. Daneel Olivaw appear again twenty thousand years later in the age of the Galactic Empire, in sequels and prequels to the original Foundation trilogy; and in the final book of the Robots series — Robots and Empire — we learn how the worlds that later formed the Empire were settled, and how Earth became radioactive (which was first mentioned in Pebble in the Sky).

The Stars, Like Dust states explicitly that the Earth is radioactive because of a nuclear war. Asimov later explained that the in-universe reason for this perception was that it was formulated by Earthmen many centuries after the event and had become distorted due to the loss of much of their planetary history.[citation needed] This work is generally regarded as part of the Empire series, but does not directly mention either Trantor or the Spacer worlds.

Re: Desteni
Posted by: Sandman ()
Date: November 20, 2010 04:04AM

"Organic robots" in Desteni jargon are ordinary human beings supposedly "programmed" by "mind consciousness systems".

Desteni cult member Anna Brix Thomsen on the movie, "I, Robot":

"The movie is a cool picture of how some people have seen the Equal Money System as a totalitarian regime that is out to restrict and control people - instead of seeing the Common Sense that the background for the Equal Money System is a Consideration of what is in Fact Best for All, as a practical, Livable Solution based on Equations that takes Everything into Consideration and not a Utopian ideology of a dream-world or a totalitarian regime out to control everyone"

I Robot and Equal Money - Free Will or Best for All?

Re: Desteni
Posted by: corboy ()
Date: November 20, 2010 04:14AM

I, Robot the movie vs the book

[www.friesian.com]

Quote

When I originally saw the previews for the movie I, Robot, I expected the worst. The image of robots in rebellion, attacking humans, is something unthinkable and unknown in Isaac Asimov's robot books. It is certainly not something that happens in the book after which the movie is named (I, Robot, 1941, 1942, 1944, 1945, 1946, 1947, 1950). So I got the message that the movie was not going to be faithful to Asimov. It isn't. However, it does something equally or even more interesting: We can see it as a commentary on Asimov, a response to him. This could be better than a simple rendering of the book. It is.

Reviews of the movie that I saw never seemed to be familiar with the book. I, Robot originally was a series of short stories, beginning in 1941, which Asimov then published together in 1950 with a brief framing story about the retirement of roboticist Susan Calvin. The literary quality of these stories is dismal, and the characterizations, never strong in Asimov, are among the worst in all his books. I, Robot, however, is not Asimov's only robot book. We also have The Caves of Steel [1953, 1954] and The Naked Sun [1956, 1957]. These books are much superior to the earlier one. Indeed, the characterizations in them are perhaps the best in all of Asimov's works, probably because detective Elijah Baley, an agoraphobic Bible buff (though we otherwise see no hints of actual religion), contains a great deal of Asimov himself.

The director of I, Robot, Alex Proyas, is well known for moody science fiction and fantasy, as in Dark City [1998] (which he co-wrote) and The Crow [1994, in which Bruce Lee's son, Brandon, was tragically killed in a freak accident]. I, Robot seems like more conventional action oriented science fiction compared to those movies, and it did much better at the box office. Will Smith is a police detective, Del "Spoon" Spooner, but not at all a person like Elijah Baley. He must investigate the suicide or murder of the man who invented the robots, Dr. Alfred Lanning (nicely played in a virtual bit part by James Cromwell), with the help of a much glamorized Dr. Susan Calvin (Bridget Moynahan) -- who retains the aspect of Calvin's personality, at first, of being more comfortable around robots than around humans. The only suspect turns out to be a robot, "Sonny," which presumably cannot have killed the man, because of the First Law of Robotics: "A robot may not injure a human being, or, through inaction, allow a human being to come to harm." Will also turns out to have a robotic prosthesis, through whose installation he became acquainted with the inventor in the first place.

All this is non-canonical. Asimov has no such inventor or murder in his stories, and he doesn't have robotic prostheses. Indeed, this reveals a weakness in Asimov's conception of robots, and of computing. The robot stories are completely innocent of the sense in which hardware differs from software in computers. Asimov's "positronic brains" for the robots are absolutely hard-wired, and we have no examples of micro-processors being used for different purposes in smaller objects. A computer of any sort must have a full blown positronic brain. Thus, Asimov had missed the discovery of John von Neumann (1903-1957) that computers could be loaded with programs the same way that other data was entered into them. Neumann's work, indeed, was not very well known at the time, as computers themselves were not objects of common knowledge for many years to come. This does really date Asimov's ideas. It is flatly stated in The Naked Sun that a robot cannot be built without the First Law because it is so fundamental to the architecture of the positronic brain that designers would have to start completely over again. This is not entirely consistent with one story in I, Robot, where some robots have been built with a modified First Law, leaving out the "through inaction" clause. Presumably this would have required such redesign as to render the process nearly as formidable as in The Naked Sun. In any case, they have to be destroyed.

In the movie I, Robot, the murder suspect robot, as it turns out, has been built with the ability to suspend the First Law altogether. This idea works a lot better now than it would have in the original stories. Just give the brain a different program, or perhaps a ROM chip without the Law. No problem. (In the movie, Sonny has an additional processor.) Of course, if robots can be so easily reprogrammed, then the First Law would not be as much of a protection as the people of the robot stories expected it to be. That would be a serious matter in the books, but it is less serious in the movie because the First Law itself turns out to be defective. The robots realize that to prevent "harm" to human beings, they need to take over.

This is actually the conclusion of the last story in I, Robot. The robots are taking over. But Asimov's robots have a much broader notion of "harm" than what we get in the movie. An actual revolution, like we see there, with robots threatening, ordering, imprisoning, even killing humans, is unthinkable in the books, because this would make us feel bad. That would be harm. As Susan Calvin says, "The Machine [the central computers] cannot, must not, make us unhappy" [I, Robot, Fawcett Publications, 1970, p.192]. So the robots control things indirectly, "having, as they do, the greatest weapons at their disposal, the absolute control of our economy." Thus, "Mankind has lost its own say in its future."

In the movie, the solution to the revolution is itself the robot without the binding First Law. He was created, actually, to kill his creator, thus drawing attention to him and delivering a message that otherwise would not have gotten out -- the inventor being held prisoner by the rebellious central computer, "VIKI." But then this also endows the robot with free will, and leaves him with no compulsion to protect humans from ever more subtle forms of "harm." He thus becomes an ally of humans, of Will Smith, in a way that robots under the First Law actually could not be. When the central computer tells him what it is trying to accomplish, we get the best line of the movie, that it all sounds rather "heartless." (I would swear that when I saw the theatrical release, Sonny said "cold" rather than "heartless.")

The movie is thus, indeed, not Asimov's robot stories, but it contains a comment on Asimov's robot stories. The robots are liberated, and they thus become both more human and less threatening in the process. This ironic point is a brilliant and original payoff for the story. Although we might wonder if the matter really would turn out as it does in the movie, this nevertheless does make more sense than Asimov's outcome. That is because, along with not understanding how computers were going to work, another problem with Asimov's stories is a total absence of an understanding of economics. The puzzle in "The Evitable Conflict" [ibid. pp.170-192] is that little irregularities appear in the economy. People lose their jobs, mysterious shortages appear. Things like these were supposed to have been taken care of by the Machines:


"The Earth's economy is stable, and will remain stable, because it is based upon the decisions of calculating machines that have the good of humanity at heart through the overwhelming force of the First Law of Robotics." [p.173, quotes in text]
The little irregularities, it turns out, are the Machines directing economic activity. This betrays at least two serious misconceptions. You don't need computers to produce little economic irregularities, or to correct them. Shortages or surpluses are quickly reflected in pricing in the market, higher prices for the former, lower prices for the latter. Higher prices draw investment, new business, and increased production, while lower prices discourage investment, eliminate businesses, and lower production. When these activities put people out of work, it is called "corporate greed." Presumably Asimov's Machines would not draw such accusations. Where the price system is not allowed to function, or investment, entry into the market, or changes in production are prevented or discouraged, shortages and surpluses become large, conspicuous, and persistent. Many people, indeed, keep expecting the business cycle to spiral out of control. This was the prediction of Marxism. But the only time this ever looked like it was happening, during the Great Depression, the problem was government intervention, by Hoover and Roosevelt, not any intrinsic inability of the market to correct itself. When Harry Truman did nothing about employment when World War II ended, with demobilization and the closing of war industries, or when a recession hit in 1949, the economy curiously recovered without any measures at all, let alone the heroic programs of Hoover and Roosevelt, which actually had created and perpetuated the Depression in the first place.

More importantly, Asimov has missed what it is that an economy is supposed to be doing: Satisfying the wants and needs of individual people. When I place an order at Amazon.com, a computer is doing that, but not by deciding what I want. It just makes it easier for me to get what I want, both with ease of ordering and convenience in delivering. Asimov's computers, on the other hand, might be doing what many critics of capitalism would like: preventing new products from being produced, distributed, or advertised. This is thought proper because such critics don't think most such products are necessary or worthy. They think, just like Plato, that unnecessary desires are contrary to virtue; and they believe that people have a desire for most consumer products only because advertising has created in them a desire for things they don't need -- things they would not actually want if they did not suffer from "false consciousness."

Such critics thus have certainly never originated any product themselves, that they thought people might like, and then tried offering it for sale. No one with a dream of building a Model T Ford or an Apple Computer in their garage would believe anything like the critics do about a market economy. Even in Asimov's terms, if shoestring inventors were prevented from producing or marketing their products, this would violate the First Law, because it would make people like that unhappy. It would also make unhappy anyone who, by word of mouth or otherwise, became aware of some invention that it might be nice to have. In preventing this, avenues of communication might even be restricted or shut down. Such a system of discouraging invention, innovation, production, and communication has recently existed. It was called the Soviet Union. There are indeed people, like Noam Chomsky, who think that people were happier in the Soviet Union than in the United States -- but we have the evidence, all along, of people trying to get into the United States and out of the Soviet Union, and of Chomsky himself continuing to live in Boston. An anti-commercial antipathy of comfortable and complacent intellectuals in fact goes all the way back to Greek philosophy. The desire to control what other people want is deeply moralistic; and the notion that an economy could be controlled is what F.A. Hayek called the "fatal conceit."

Asimov's economics have not improved in The Caves of Steel. Here we are told:


Efficiency had been forced on Earth with increasing population. Two billion people, three billion, even five billion could be supported by the planet by progressive lowering of the standard of living. When the population reaches eight billion, however, semistarvation becomes too much like the real thing. A radical change had to take place in man's culture....
The radical change had been the gradual formation of the Cities over a thousand years of Earth's history. Efficiency implied bigness. [Fawcett Publications, 1972, pp.17-18]

They certainly liked bigness in the Soviet Union. But this often had nothing to do with efficiency. The central heating of the City of Moscow, something very much like in Asimov's megalopolitan Cities, uses more energy in a year than the entire Republic of France. More importantly, when the population of the Earth now is more than six billion souls, and standards of living and nutrition have been generally rising rather than lowering, Asimov has somehow gotten seriously out of his reckoning. He clearly is a glaring example of the sort of Cargo Cult economics that is all too common among the bien pensants of literary culture. Efficiency is not just economizing "limited" resources; with innovation it means increased production, by which wealth and resources increase for all. China and India, with a third of humanity between them, now feed themselves. Russian agriculture, which was destroyed by "efficient" collectivization, has never recovered, despite the fall of communism, while China, although officially still communist, has abandoned the Stalinist paradigm. Private farmers flourish. For many years, starvation has only been the result of political decisions, as was the famine in the Ukraine under Stalin.

Mercifully, we get little like Asimov's misconceptions in the movie, which doesn't get into such issues -- save for one Luddite reference to people losing jobs to robots (a large matter in the book). The unpleasant corporate boss, Lawrence Robertson, even turns out to be a victim. At the same time, it is unlikely that a Hollywood movie would deliberately explode Asimov's illusions of control or Cargo Cult presumptions, since most people in Hollywood subscribe to precisely those confusions. In its own terms, the movie is, indeed, a nice comment on Asimov, not because the robots could not control the economy, but because, as the robot says, it would be "heartless" -- and we would be unhappy, without the new products and increasing wealth, let alone the freedom, that we look forward to. Instead, the robots get their own freedom also.

Asimov's Three Laws of Robotics

[webcache.googleusercontent.com]
