lunes, 27 de diciembre de 2010

Internet Evolution - Michael Mascioni - How Alternate Reality Games Affect Real Life

This exposition of the possible effects of "alternate reality" games on the present generation is interesting. The initial data reveal a definite and real increase in the perceptual capacity of the people who play them. This suggests that we may be seeing the emergence of a new type of human being, one who can access sensory experiences and consider them real in a way never before experienced.



How Alternate Reality Games Affect Real Life

Written by Michael Mascioni
12/22/2010

Alternate reality games (ARGs) are gaining greater popularity, as entertainment increasingly shifts toward the fusion of real and virtual content and draws on the social experiences of its audiences. Given these trends, it’s unsurprising that ARGs are resonating with broader audiences and influencing culture as well as the world of online games.
New research on ARGs is highlighting their relevance and impact across groups. For example, research studies conducted with players of ARGs and MMORPGs (massively multiplayer online role-playing games) underline the realistic social experiences afforded by these games.
One researcher, Priscilla Haring, who works at the Center for Advanced Media Research at VU University, Amsterdam, conducted interviews and an online survey as part of her master’s thesis in media psychology, with 81 participants, 45 of whom were ARG players and 36 of whom were MMORPG players. The participants were drawn from MMORPG forums, World of Warcraft players, ARG chat channels, and the ARG Unfunction forum.
Haring found that "ARG players experience a higher perceived reality while in-game compared with MMORPG players," and she attributes this to the "fact that physical reality is already incorporated in ARGs," and to the idea that "players themselves act as platforms, bringing all their experiences and expectations into the games."
To clarify, she defines "perceived reality" as "the degree to which a gaming environment or medium is experienced or has the effect of a real world." She defines "social presence" as "the extent to which players in the gaming environment experience the presence of other players."
The rise of serious ARGs has also acclimated game players to the greater realism ARGs can offer, particularly since those ARGs explicitly invite greater player participation in real-world issues. This is vividly demonstrated by such ARGs as Urgent Evoke, which calls on players to implement change in Africa through entrepreneurial efforts.
Another distinctive element of ARGs is their facility for engendering more social experiences. This could be partially explained by what Haring calls "the collaborative nature of ARG gameplay” and the fact that ARG players can’t “hide behind avatars."
One of the most striking elements of her research concerned the impact both ARGs and MMORPGs have on players’ lives. The research found a "high degree of transference of in-game experiences to real life among both ARG and MMORPG players."
The combination of ARGs' "perceived realism" and their capacity to afford transference of in-game experiences to real life makes them a very effective "social learning tool," according to Haring.
Clearly, Haring’s research needs to be kept in perspective, as it is based on a small sample. But the results generally correlate well with larger-scale research studies and the track records of major ARGs. It’s becoming increasingly evident that ARGs have significant implications for the future of games and culture, especially with their ability to project their influence further into real life.
— Michael Mascioni is a market research consultant in digital media.

jueves, 23 de diciembre de 2010

OUT OF OUR BRAINS por Andy Clark

This topic belongs under Webxistence, but it also raises religious questions of great importance.

There is currently enormous interest in studying where the mind resides, whether "inside" the body or whether it can be "outside" it. This relates to the famous "out of body experiences" that have received so much media exposure. Major institutions are studying this and speculating on the possibilities from diverse points of view: philosophical, metaphysical, psychological, and psychiatric. Some include religious aspects, but from "New Age"-type perspectives (Eastern philosophies, mysticisms, shamanisms, etc.), while refusing to enter the formal religious or spiritual field. That field, according to most scientific scholars, lies outside science, since it is neither demonstrable nor repeatable.

Brain scans (CAT scans) have been made of monks meditating and of Pentecostals praying and speaking in tongues, and the scans can be evaluated to determine which areas of the brain are "active" at a given moment. That can be assessed scientifically, and it has been found that there is, in fact, clear and evident brain activity at work. That is, something is really happening in the brain.

But as to whether meditation achieves some spiritual end, or whether speaking in tongues is communication with the spirit, science says nothing. One thing is one thing, and another thing is another thing.

I invite you to read the article and to think, meditate, and pray.

I close with the words of Cardinal Carlo Maria Martini, quoted by Msgr. Albacete in God at the Ritz:

"Reason, if exercised diligently as the search for meaning and purpose, anticipates prayer; in fact, it is already a form of it. And prayer, if exercised with equal persistence and with an open mind, opens the space for adoration."

December 12, 2010, 3:47 PM

Out of Our Brains

The Stone is a forum for contemporary philosophers on issues both timely and timeless.
Where is my mind?
The question — memorably posed by rock band the Pixies in their 1988 song — is one that, perhaps surprisingly, divides many of us working in the areas of philosophy of mind and cognitive science. Look at the science columns of your daily newspapers and you could be forgiven for thinking that there is no case to answer. We are all familiar with the colorful “brain blob” pictures that show just where activity (indirectly measured by blood oxygenation level) is concentrated as we attempt to solve different kinds of puzzles: blobs here for thinking of nouns, there for thinking of verbs, over there for solving ethical puzzles of a certain class, and so on, ad blobum. (In fact, the brain blob picture has seemingly been raised to the status of visual art form of late with the publication of a book of high-octane brain images.)
There is no limit, it seems, to the different tasks that elicit subtly, and sometimes not so subtly, different patterns of neural activation. Surely then, all the thinking must be going on in the brain? That, after all, is where the lights are.
As our technologies become better adapted to fit the niche provided by the biological brain, they become more like cognitive prosthetics.
But then again, maybe not. We’ve all heard the story of the drunk searching for his dropped keys under the lone streetlamp at night. When asked why he is looking there, when they could surely be anywhere on the street, he replies, “Because that’s where the light is.” Could it be the same with the blobs?
Is it possible that, sometimes at least, some of the activity that enables us to be the thinking, knowing, agents that we are occurs outside the brain?
The idea sounds outlandish at first. So let’s take a familiar kind of case as a first illustration. Most of us gesture (some of us more wildly than others) when we talk. For many years, it was assumed that this bodily action served at best some expressive purpose, perhaps one of emphasis or illustration. Psychologists and linguists such as Susan Goldin-Meadow and David McNeill have lately questioned this assumption, suspecting that the bodily motions may themselves be playing some kind of active role in our thought process. In experiments where the active use of gesture is inhibited, subjects show decreased performance on various kinds of mental tasks. Now whatever is going on in these cases, the brain is obviously deeply implicated! No one thinks that the physical handwavings are all by themselves the repositories of thoughts or reasoning. But it may be that they are contributing to the thinking and reasoning, perhaps by lessening or otherwise altering the tasks that the brain must perform, and thus helping us to move our own thinking along.
“Brain Cloud (2010)” on display at the Metropolitan Museum of Art in New York as part of a show by John Baldessari. (Hiroko Masuike for The New York Times)
It is noteworthy, for example, that the use of spontaneous gesture increases when we are actively thinking a problem through, rather than simply rehearsing a known solution. There may be more to so-called “handwaving” than meets the eye.
This kind of idea is currently being explored by a wave of scientists and philosophers working in the areas known as “embodied cognition” and “the extended mind.” Uniting these fields is the thought that evolution and learning don’t give a jot what resources are used to solve a problem. There is no more reason, from the perspective of evolution or learning, to favor the use of a brain-only cognitive strategy than there is to favor the use of canny (but messy, complex, hard-to-understand) combinations of brain, body and world. Brains play a major role, of course. They are the locus of great plasticity and processing power, and will be the key to almost any form of cognitive success. But spare a thought for the many resources whose task-related bursts of activity take place elsewhere, not just in the physical motions of our hands and arms while reasoning, or in the muscles of the dancer or the sports star, but even outside the biological body — in the iPhones, BlackBerrys, laptops and organizers which transform and extend the reach of bare biological processing in so many ways. These blobs of less-celebrated activity may sometimes be best seen, myself and others have argued, as bio-external elements in an extended cognitive process: one that now criss-crosses the conventional boundaries of skin and skull.
One way to see this is to ask yourself how you would categorize the same work were it found to occur “in the head” as part of the neural processing of, say, an alien species. If you’d then have no hesitation in counting the activity as genuine (though non-conscious) cognitive activity, then perhaps it is only some kind of bio-envelope prejudice that stops you counting the same work, when reliably performed outside the head, as a genuine element in your own mental processing?
Another way to approach the idea is by comparison with the use of prosthetic limbs. After a while, a good prosthetic limb functions not as a mere tool but as a non-biological bodily part. Increasingly, the form and structure of such limbs is geared to specific functions (consider the carbon-fiber running blades of the Olympic and Paralympic athlete Oscar Pistorius) and does not replicate the full form and structure of the original biological template. As our information-processing technologies improve and become better and better adapted to fit the niche provided by the biological brain, they become more like cognitive prosthetics: non-biological circuits that come to function as parts of the material underpinnings of minds like ours.
Many people I speak to are perfectly happy with the idea that an implanted piece of non-biological equipment, interfaced to the brain by some kind of directly wired connection, would count (assuming all went well) as providing material support for some of their own cognitive processing. Just as we embrace cochlear implants as genuine but non-biological elements in a sensory circuit, so we might embrace “silicon neurons” performing complex operations as elements in some future form of cognitive repair. But when the emphasis shifts from repair to extension, and from implants with wired interfacing to “explants” with wire-free communication, intuitions sometimes shift. That shift, I want to argue, is unjustified. If we can repair a cognitive function by the use of non-biological circuitry, then we can extend and alter cognitive functions that way too. And if a wired interface is acceptable, then, at least in principle, a wire-free interface (such as links your brain to your notepad, BlackBerry or iPhone) must be acceptable too. What counts is the flow and alteration of information, not the medium through which it moves.
When information flows, some of the most important unities may emerge in regimes that weave together activity in brain, body and world.
Perhaps we are moved simply by the thought that these devices (like prosthetic limbs) are detachable from the rest of the person? Ibn Sina (Avicenna), a Persian philosopher-scientist who lived between 980 and 1037 A.D., wrote in the seventh volume of his epic “De Anima (Liber de anima seu sextus de naturalibus)” that “These bodily members are, as it were, no more than garments; which, because they have been attached to us for a long time, we think are us, or parts of us [and] the cause of this is the long period of adherence: we are accustomed to remove clothes and to throw them down, which we are entirely unaccustomed to do with our bodily members” (translation by R. Martin). Much the same is true, I want to say, of our own cognitive circuitry.
The fact that there is a stable biological core that we do not “remove and throw down” blinds us to the fact that minds, like bodies, are collections of parts whose deepest unity consists not in contingent matters of undetachability but in the way they (the parts) function together as effective wholes. When information flows, some of the most important unities may emerge in integrated processing regimes that weave together activity in brain, body, and world.
Such an idea is not new. Versions can be found in the work of James, Heidegger, Bateson, Merleau-Ponty, Dennett, and many others. But we seem to be entering an age in which cognitive prosthetics (which have always been around in one form or another) are displaying a kind of Cambrian explosion of new and potent forms. As the forms proliferate, and some become more entrenched, we might do well to pause and reflect on their nature and status. At the very least, minds like ours are the products not of neural processing alone but of the complex and iterated interplay between brains, bodies, and the many designer environments in which we increasingly live and work.
Please don’t get me wrong. Some of my best friends are neuroscientists and neuro-imagers (as it happens, my partner is a neuro-imager, so brain blobs are part of our daily diet). The brain is a fantastic beast, more than worthy of the massive investments we make to study it. But we — the human beings with versatile bodies living in a complex, increasingly technologized, and heavily self-structured world — are more fantastic still. Really understanding the mind, if the theorists of embodied and extended cognition are right, will require a lot more than just understanding the brain. Or as the Pixies put it:

Where is my mind?

Way out in the water, see it swimming 

Andy Clark
Andy Clark is professor of logic and metaphysics in the School of Philosophy, Psychology, and Language Sciences at Edinburgh University, Scotland. He is the author of “Being There: Putting Brain, Body, and World Together Again” (MIT Press, 1997) and “Supersizing the Mind: Embodiment, Action, and Cognitive Extension” (Oxford University Press, 2008).

lunes, 13 de diciembre de 2010

An excellent in-depth article that raises the possibility that the English philosopher David Hume had direct contact with Buddhist ideas. History had dismissed this possibility, but this researcher presents a strong case in favor of the contact.

What is surprising is the enormous revelation of the reach the Jesuits have had in the scientific, philosophical, and intellectual advances of recent centuries. The Jesuits, as this piece demonstrates, are a hurricane force that has opened up to knowledge traditions hidden and concealed for centuries. In doing so, they and their work have given us much of what we discuss today in human thought.

Also striking is the conflictive tradition this style of evangelization has produced within the Church, and the strong confrontations with other orders, such as the Capuchins, that propose different paths of evangelization.

After all, Christ said that he came not to unite but to bring division. This is one way of seeing that his word is Life.

Could David Hume Have Known about Buddhism? Charles Francois Dolu, the Royal College of La Flèche, and the Global Jesuit Intellectual Network
Alison Gopnik

Figment Aims for Young Readers and Writers

Stimulating the development of reading and writing skills in the new generations is a priority if we want to achieve a future with possibilities. This site promotes exactly that: that young people expose themselves to, and in turn put forward, the production of their ideas in spoken and written form.

It is worth keeping accessible, and worth reflecting on what is produced there.

Web Site for Teenagers With Literary Leanings

When Jacob Lewis helped create the beta version of the Web site Figment with Dana Goodyear, a staff writer at The New Yorker, Mr. Lewis envisioned it as a sort of literary Facebook for the teenage set.
Chester Higgins Jr./The New York Times
Jacob Lewis, one of the founders of Figment.
“I really went into it and thought, ‘We’ll be the social network for young-adult fiction,’ ” said Mr. Lewis, a former managing editor of The New Yorker. “But it became clear early on that people didn’t want a new Facebook.”
The young people on the site weren’t much interested in “friending” one another. What they did want, he said, “was to read and write and discover new content, but around the content itself.” Figment will be unveiled on Monday as an experiment in online literature, a free platform for young people to read and write fiction, both on their computers and on their cellphones. Users are invited to write novels, short stories and poems, collaborate with other writers and give and receive feedback on the work posted on the site.
The idea for Figment emerged from a very 21st-century invention, the cellphone novel, which arrived in the United States around 2008. That December, Ms. Goodyear wrote a 6,000-word article for The New Yorker about young Japanese women who had been busy composing fiction on their mobile phones. In the article she declared it “the first literary genre to emerge from the cellular age.”
Figment is an attempt to import that idea to the United States and expand on it. Mr. Lewis, who was out of a job after Portfolio, the Condé Nast magazine, was shuttered last year, teamed up with Ms. Goodyear, and the two worked with schools, libraries and literary organizations across the country to recruit several hundred teenagers who were willing to participate in a prototype, which went online in a test version in June.
“We wanted people to be able to write whatever they wanted in whatever form they wanted,” Mr. Lewis said. “We give them a piece of paper and say, ‘Go.’ ” He added that so far contributions had included fantasy, science fiction, biographical work and long serial novels. “There’s a very earnest and exacting quality to what they’re doing.”
Teenagers and their reading habits have been the subject of much fascination in the publishing industry lately. They were a huge driving force behind best-selling books like the “Twilight” series by Stephenie Meyer and the crop of paranormal-romance books that followed. Publishers are eager to learn more about their reading habits and introduce books to them.
Mr. Lewis said he hoped Figment would eventually attract more than a million users and serve as an opportunity for publishers to roam the Web site looking for fresh young talent, or promote their own authors by running book excerpts. “For publishers this is an amazing opportunity to not only reach your consumers but to find out really valuable information about how they are reading,” he said.
Several publishers have already signed on. Running Press Kids, a member of the Perseus Books Group, will provide an excerpt from “Purple Daze,” a historical novel for teenagers written by Sherry Shahan. (Figment charges a small fee to publishers for the privilege.)
David Steinberger, the chief executive of Perseus, said he saw Figment as an opportunity to get the company’s content in front of teenagers.
“The teen culture is a constantly moving target,” Mr. Steinberger said. “We’re looking for partners who are deeply embedded in the way teens interact.”

Book Review - The Master Switch - By Tim Wu -

This book looks at the forces currently at play in the telecommunications and web industries and proposes that the impact of the changes under way is significant and definitive.
I tend to agree with this presentation and concur with its main proposal: that we must be aware that we are living amid a structural change in human society.

Worth reading.


From Hobby to Industry

Shortly after the United States developed the first atomic bomb, J. Robert Oppenheimer realized the country would need a new kind of weapons laboratory. This lab would maintain and improve the military’s arsenal rather than create new weapons. It would be called Sandia National Laboratories and placed not far from Los Alamos.
Illustration by Harry Campbell


The Master Switch: The Rise and Fall of Information Empires
By Tim Wu
366 pp. Alfred A. Knopf. $27.95


Initially, the University of California ran the lab, but President Truman soon decided to transfer its operation to the entity he thought could best run it during the nascent cold war: AT&T. “In my opinion,” Truman wrote to an AT&T subsidiary in 1949, “you have here an opportunity to render an exceptional service in the national interest.” AT&T ended up running Sandia until the early 1990s.
It was one of the more extraordinary instances of Ma Bell’s involvement with Uncle Sam. The company owed its very existence to a favorable federal patent ruling in 1878, which saved it from an early death at the hands of Western Union, the dominant telegraph company then trying to crush its new rival. A little more than a century later, Washington broke up AT&T. But regulators soon allowed many of the company’s parts to merge back together. This consolidation, Tim Wu argues in “The Master Switch,” probably allowed the Bush administration to conduct its wiretapping program in secret for so long.
AT&T is the star of Wu’s book, an intellectually ambitious history of modern communications. The organizing principle — only rarely overdrawn — is what Wu, a professor at Columbia Law School, calls “the cycle.” “History shows a typical progression of information technologies,” he writes, “from somebody’s hobby to somebody’s industry; from jury-rigged contraption to slick production marvel; from a freely accessible channel to one strictly controlled by a single corporation or cartel — from open to closed system.” Eventually, entrepreneurs or regulators smash apart the closed system, and the cycle begins anew.
The story covers the history of phones, radio, television, movies and, finally, the Internet. All of these businesses are susceptible to the cycle because all depend on networks, whether they’re composed of cables in the ground or movie theaters around the country. Once a company starts building such a network or gaining control over one, it begins slouching toward monopoly. If the government is not already deeply involved in the business by then (and it usually is), it soon will be.
Wu argues that it has little choice. Not only are communications businesses particularly prone to consolidation, but the political effects are far greater than they would be in other industries. The book’s title comes from a line by Fred Friendly, the longtime CBS News executive, in which he distinguished between free-speech laws and “exclusive custody of the master switch.” They are two different things, but either has the ability to shape the flow of information. The same cannot be said, Wu notes, “of orange juice, heating oil, running shoes or dozens of other industries, no matter their size.”
Today may seem an odd time to be making this argument, given the online flowering of discourse, civil and otherwise. But Wu makes a good case that the Internet is vulnerable to the cycle. The world’s computer network is ultimately a physical entity, onto which other forms of communication — film, telephone, television, radio — are starting to migrate. This is what media executives mean by “convergence.” It seems likely to help big companies get even bigger, and arguably offers the potential for even tighter control of information than existed in the past.
Wu’s candidates for the AT&T of the future are Comcast (if its takeover of NBC-Universal succeeds), Google (if it decides to abandon its tradition of openness and instead tries to eliminate rivals) and some combination of AT&T itself and Apple. But he ponders this only briefly and acknowledges that it is too early to know. His most thought-provoking argument about the future may actually be the past.
The similarities between radio and the Internet are particularly striking. Radio in the early 20th century was a scattering of amateur stations, started not for profit, but so its hobbyists would have a platform for their views and interests. Lee De Forest, a pioneer who ran a station in the Bronx, urged young people to listen so that in “the still night hours” they could “welcome friendly visitors from the whole wide world.” Waldemar Kaempffert, the editor of Scientific American, proclaimed, “All these disconnected communities and houses will be united through radio as they were never united by the telegraph and telephone.”
Soon, however, radio’s heterogeneity began to fade. In 1922, WEAF in New York, AT&T’s flagship station, broadcast the first advertisement, for a leafy housing development in Queens. By 1926, AT&T had teamed up with the Radio Corporation of America to form a new radio network, the National Broadcasting Company. The station, at 660 AM, would become known as WNBC (and later as WFAN, the country’s first radio station devoted to sports).
AT&T and the radio manufacturers wanted radio to transform from hobby to big business, so they decided to fight back, publicly and in the courts, against a commerce secretary who had been protecting radio from what he called “advertising chatter.” That commerce secretary was Herbert Hoover. A federal court eventually ruled that Hoover had no authority to assign radio frequencies, and NBC’s network mushroomed. The cycle was under way.
Wu wisely avoids magic-bullet solutions to the inevitable problems of the communications industry. Simply keeping government out of the business does not work, because the industry tends to form private monopolies if left alone. And having the government run the business, as England and other countries have tried, presents its own problems. The government itself is a monopolist and often behaves like one. Wu instead calls for constraining “all power that derives from the control of information.” He writes, “If we believe in liberty, it must be freedom from both private and public coercion.”
In practice, this would mean that the Justice Department would have to adopt a broader definition of its antitrust powers, beyond its typical emphasis on competition’s effect on prices. The longstanding Hollywood censorship code did not raise ticket prices, after all, but it did violate the country’s ideals. Similarly, AT&T had the power to quash the answering machine, which, incredibly, a Bell Labs engineer invented in 1934. The “government’s only proper role,” Wu concludes, “is as a check on private power, never as an aid to it.” It is a useful rule of thumb, even if not every choice breaks down quite so clearly.
Wu previously popularized the phrase “net neutrality,” an updated version of the old notion of “common carriage” in the telephone industry. The central idea, as with that of this book, is that market competition brings enormous benefits, but the market itself does not ensure competition — or, more broadly, desirable outcomes.
This argument can be extended to the economic importance of communications, even if Wu’s concern is more political. The businesses he describes have been some of the American economy’s great global successes: Google, NBC, Paramount Pictures and many others. They were able to lead the way in part because they could take advantage of the fruits of this country’s long tradition — now somewhat diminished — of government investment in basic scientific research that the private sector would never find profitable.
The Internet, to take one example, may now be the world’s communications network. But it started as a Defense Department project. As “The Master Switch” artfully shows, the government often has a role that no company will play on its own.
David Leonhardt is an economics columnist for The Times.