Tyler Mariné, Architect





Agdis presented a proposal for the development of a family of open-air structures in Hyde Park. Our material prototype, whom we have fondly named Jack, is a prefabricated structural unit with a natural variety of shape, similar to the kinds of variety found in crystal geology or cellular biology. By assembling a critical mass of Jacks, a characteristic formal outcome is produced with a unique set of structural qualities. These material aggregations provide lightweight temporary structures to house or flank an innumerable list of potential architectural proposals. In an effort to address the spontaneous nature of public activity in Hyde Park, and indeed the free, organic nature of urban development in general, Agdis proposes an architecture without mechanical connections, without hierarchy; an architecture without any apparent order.

Our initial tests of Jack’s structural viability resulted in dozens of tree-like forms such as the one shown below. In each case, the abilities of such a lightweight structural system without any designed architectural connections baffled us. It became clear that this type of aggregate could be used as an infill system to take compression loads at a fraction of the overall mass used in traditional construction techniques.

Jack v.1.0 was clearly based on the six-pointed geometries found in the most basic crystals. Our first revisions to Jack v.1.0 attempted to increase the amount of friction between units. A major challenge in the design of the original unit was its ratio of individual mass to displaced volume. If the unit does not fill enough of its volumetric space to catch and hold its neighbours, other Jacks simply fall through the large gaps. Subsequent designs aimed to reduce the size of local gaps to increase bonding. In order to simplify the construction of our Jacks we began limiting the number of parts used to complete the unit. The original Jack had three pieces that needed to be assembled and glued to ensure rigidity.

We designed 28 different types of Jack to test for a superior performer. The data from our initial testing is displayed here, showing the remarkable ability of the light-weight Jack to displace volume by nesting to resolve forces through an emergent system of friction connections. While it is true that some Jack configurations performed better than others, the displacement is quite high among the entire family.
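The displacement comparison above can be sketched as a simple calculation: how much volume a pile occupies relative to the solid volume of the units it contains. The sketch below is in Python for illustration (our actual tests were physical), and every number in it is a placeholder, not our measured data.

```python
# Illustrative sketch: ranking Jack variants by displacement ratio.
# All masses and volumes here are invented placeholders.

def displacement_ratio(unit_mass_g, pile_mass_g, pile_volume_cm3, unit_volume_cm3):
    """Volume a pile occupies relative to the solid volume of its units."""
    n_units = pile_mass_g / unit_mass_g
    solid_volume = n_units * unit_volume_cm3
    return pile_volume_cm3 / solid_volume

# Two hypothetical variants from the family of 28:
variants = {
    "Jack v.1.0": displacement_ratio(12.0, 600.0, 9000.0, 45.0),   # -> 4.0
    "Jack v.1.3": displacement_ratio(10.0, 600.0, 11000.0, 40.0),  # higher
}
best = max(variants, key=variants.get)
```

A higher ratio means the units resolve the same loads while filling far less of the space with material, which is the quality the testing was screening for.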

While the initial physical testing of Jack proved provocative and we were quickly hooked on it as a design subject, understanding the behaviour of an aggregate was nearly impossible. Although photographs of piles and charts of pile characteristics gave us relevant information, we wanted to understand the dynamic relationships that exist between each individual Jack. The Agdis Simulator has been one of our most challenging design exercises. This level of application programming, or tool building, requires an in-depth understanding of how Maya works. At this level, Maya is limited only by the hardware that supports it. This screenshot shows one pile test. There are three types of Jacks available in this version of the app. Each has a set of dynamic attributes that can be modified, such as scale, mass, bounciness and damping, and static and dynamic friction.
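The per-type attributes the simulator exposes can be summarised in a few lines of code. This sketch is written in Python purely for illustration (the actual tool was MEL running inside Maya), and the attribute values are placeholders rather than the settings we used.

```python
# Minimal sketch of the per-type dynamic attributes in the Agdis
# Simulator. Python stands in for MEL; all values are illustrative.
from dataclasses import dataclass

@dataclass
class JackType:
    scale: float             # uniform size multiplier
    mass: float              # per-unit mass
    bounciness: float        # restitution on collision
    damping: float           # velocity decay
    static_friction: float   # resistance before sliding starts
    dynamic_friction: float  # resistance while sliding

# Three jack types, as in this version of the app (placeholder values):
jack_types = [
    JackType(1.0, 0.05, 0.2, 0.1, 0.8, 0.6),
    JackType(1.5, 0.08, 0.2, 0.1, 0.8, 0.6),
    JackType(2.0, 0.12, 0.2, 0.1, 0.8, 0.6),
]
```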

One of the most interesting aspects of the Agdis Simulator is its ability to display each Jack’s contact information. Using a variety of MEL scripts we were able to export this type of data and diagram it, giving us a visual means to understand how an aggregate behaves structurally. Both contact relationships and contact counts are shown in the chart above. Designing through code is an arduous process, and results are difficult to achieve. In fact, most times when a piece of code is designed well, and works properly, nothing happens at all. The sign of success is simply a lack of syntax errors.
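The contact bookkeeping behind that chart can be sketched very simply: given the pairwise contacts the simulator reports, tally how many neighbours each Jack touches. The sketch below is Python rather than MEL, and the contact pairs are invented for illustration.

```python
# Sketch of the contact-count export: tally neighbours per jack from
# a list of contact pairs. Pair data here is invented, not simulator
# output.
from collections import Counter

contacts = [("jack1", "jack2"), ("jack1", "jack3"),
            ("jack2", "jack3"), ("jack3", "jack4")]

contact_counts = Counter()
for a, b in contacts:
    contact_counts[a] += 1
    contact_counts[b] += 1

# jack3 touches three neighbours; jack4 only one.
```

Plotting these counts alongside the contact relationships is what gave us a visual means to read the aggregate's structural behaviour.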

Once a pile has been generated by the simulator, different MEL scripts are used to generate the necessary statistical data for analysis. In addition to contact data and mass to volume statistics, there are also tools for converting rigid bodies from active to passive, and back again, in order to minimize the number of changing relationships Maya’s collision solver has to calculate: passive rigid bodies are not subject to outside forces, but act as stationary obstacles for other, active rigid bodies. By toggling this attribute on and off, jacks that have reached stability can be locked in position freeing up the solver, ultimately allowing for larger and larger piles to develop.
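The active-to-passive toggle described above amounts to locking any Jack that has settled. A hedged sketch, in Python rather than MEL, with an invented stability threshold and an invented per-jack record format:

```python
# Sketch of the active/passive toggle: a jack whose speed has dropped
# below a threshold is marked passive (a stationary obstacle), so the
# collision solver skips it on later steps. The threshold and the
# dict-based jack records are illustrative, not the Maya API.

STABLE_SPEED = 0.01  # illustrative cutoff

def lock_stable(jacks):
    """Mark settled jacks passive; return how many remain active."""
    for j in jacks:
        if j["active"] and j["speed"] < STABLE_SPEED:
            j["active"] = False  # now behaves as a passive rigid body
    return sum(1 for j in jacks if j["active"])

pile = [{"speed": 0.0, "active": True},
        {"speed": 0.5, "active": True},
        {"speed": 0.002, "active": True}]
remaining_active = lock_stable(pile)  # only one jack still moving
```

Shrinking the set of active bodies each step is what frees the solver and lets larger and larger piles develop.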

The most difficult design challenge for Agdis was to find a way for Jack to perform not only under compression, but also under tension. If the aggregate could only be used as a lightweight compression system, then another system would have to be used to contain it, and while we designed at least 6 "Jill" systems, not one of them proved to be as interesting as our initial piece. The goal was clear: in order for the architecture to be as progressive as the material, it alone had to be the architecture, without any secondary support systems. Two major breakthroughs made the above goal possible. The first is the introduction of a hooking mechanism at the end of each limb. The second is a design that lends itself to mass production. The beta version of Jack v.2.0 shown above includes both of these important revisions. First, the hook is included at the tip of each limb on every Jack. Second, the unit is composed of two flat pieces that slot together at the center to provide a rigid connection at the core of a new 3D object.

Since each piece was now simply a flat design, using a laser cutter to manufacture them was an easy decision. The technology is actually very simple: the laser cutter reads AutoCAD drawings just like a plotter, so no new software was necessary to start production. The crucial decisions now were which material and thickness would give the most successful performance. So long as the material strength can support the weight of the aggregate, the unit can take any load introduced. Once we had established an effective design and manufacturing technique, we began testing different materials for behavioural characteristics. If the material was too rigid, such as lightweight wood or acrylic, then breakage became a serious problem. All affordable metals were simply too heavy, and a hollow unit meant an entirely new design and construction method. At last, we discovered an effective material, and Jack v.2.0 was released. A high-impact flexible plastic, PETG, which is polyester-based, performs extremely well under tension, and proves to be lightweight and super durable. Load tests were extremely successful in both tension and compression.

In order to include both tight local connections and long-range tiebacks within the aggregate, we decided that both long and short limbs were needed. But determining a specific ratio of long to short limbs, or the most effective length of both, was obviously an insane undertaking. Again, we took our cue from nature, and devised a generative script that randomly produced a range of lengths for each jack. Every unit is therefore unique in this respect. This MEL script imports individual parts from AutoCAD, and arranges them in the above pattern in Maya per the specified multiplier. Every set of Jacks is different, and arranged in a grid for an easy return to AutoCAD for laser cutting.
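The generative step can be sketched as two small functions: one that draws a mix of short and long limb lengths per Jack, and one that lays the finished set out on a grid. Python stands in here for the MEL/AutoCAD pipeline, and the limb-length ranges, limb count, and grid spacing are all illustrative assumptions, not the values from our script.

```python
# Sketch of the generative script: random long/short limb lengths per
# jack, arranged in a grid for laser cutting. All numeric ranges are
# illustrative placeholders.
import random

def make_jack(rng, n_limbs=6, short=(4.0, 8.0), long=(12.0, 20.0)):
    """Half the limbs drawn from the short range, half from the long,
    then shuffled so each unit is unique."""
    limbs = [rng.uniform(*short) for _ in range(n_limbs // 2)]
    limbs += [rng.uniform(*long) for _ in range(n_limbs - n_limbs // 2)]
    rng.shuffle(limbs)
    return limbs

def grid_positions(count, columns, spacing):
    """Row-major (x, y) placement, ready for export to the cutter."""
    return [((i % columns) * spacing, (i // columns) * spacing)
            for i in range(count)]

rng = random.Random(0)              # seeded for a repeatable set
jacks = [make_jack(rng) for _ in range(9)]
layout = grid_positions(len(jacks), columns=3, spacing=60.0)
```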

The studio brief for our thesis project was to design a series of pavilions for the potential 2012 Olympic Games in London. Our site selection was Hyde Park, and our Pavilions are designed to host the three stages of the triathlon. The five pavilions are placed around the whole of Hyde Park at entry points and locations of activity for the events of the 2012 Olympiad. Beginning at the Swim-to-Cycle, the first pavilion locates media and event support as the central event headquarters. The final four pavilions mark entry points to the site and run adjacent to the cycling and running routes. Given the size of the park and the expected crowds, numbers in excess of 400,000 people, these markers are designed to be visible from anywhere in the park. The nature of our aggregate yields a remarkable stability under wind loading, by effectively absorbing such forces within the voided spaces of the aggregate. In addition, the large amount of surface area within the system provides an effective water shedding system, catching rainwater and distributing it evenly in order to avoid intense runoff locations, which would damage soil conditions.

In addition to the Olympic brief, Agdis introduces an entirely unique architectural proposal to build up to the 2012 Olympic Games. The intention of the Agdis Freeform Project is for the general public to get involved in constructing pavilions as a playful act of creation. While our Jacks are lightweight, at an average of 50cm across, and easy to toss around, their tendency to grab one another means that at a certain critical mass they become impossible to move or dismantle. Over the course of the years leading up to London’s 2012 Olympic Games, freeform pavilions would accumulate, augmenting the landscape of Hyde Park in a very dynamic way. Our visions of Hyde Park infused with Agdis pavilions are meant to feel very natural, as if the structures had grown slowly over time. Due to the consistent activity in the park, very useful alcoves would likely develop where people gather. As people continue to build an architecture that suits them, both individual statements and collective establishments would likely occur.

Geomorphology is the study of landforms. In order to understand as much as possible about the development of the formal aspects of the earth, scientists use data from existing landforms, data gathered in the study of geomorphic processes, and any recorded history of the land surface. Until the second half of the last century, geomorphology as a discipline had been widely understood as a series of natural processes that operate independent of human intervention. But as populations around the world have grown exponentially, and industrialization has dominated so many regions, it is becoming evident that geomorphological processes are not in fact independent of human intervention. Environmental scientists and political activists have been pushing for awareness and policy change since the late sixties, but have argued from a purely scientific “earth first” perspective. New sciences and ontological concepts are moving away from these separatist understandings of humanity and geomorphology, to a systems approach which argues that humanity is simply a part of one grand process.

In Britain, such processes as direct excavation, urban development, and waste dumping are driving landscape change. Some 688 to 972 million tonnes of earth-surface material is shifted deliberately each year, the precise figure depending on whether or not the replacement of overburden in opencast mining is taken into account. British rivers export only 10 million tonnes of sediment to the surrounding seas, and some 40 million tonnes are exported in solution. The astonishing fact is that the deliberate human transfers move nearly fourteen times more material than natural processes. The British land surface is changing faster than at any time since the last ice age, and perhaps faster than at any time in the last 60 million years. 1

The Lea River Valley in east London, for example, has a particular formal quality classifying it as a fluvial landscape, which means its current form is due to a particular type of erosion caused by rivers and streams. But it is impossible to attempt a geomorphological investigation of the Lea Valley without considering the impact of human intervention. Most of the waterways in the Lea Valley are in fact not natural at all, but created by humans to continue the growth and development of a particular urban condition. The science of geomorphology at its root is a study of displacement and distribution. In the case of the Lea Valley, waterways carry earthen material from one place to another. But if we take a systems approach to the geomorphic development of the valley, without isolating humanity as a condition external to that system, then we can begin to understand how mineral deposits can in fact move in the opposite direction of erosion, to change the form of the land.

Throughout human history, waterways have been used to transfer energy and materials from one place to another and in doing so have undergone intense urban development. Let’s be clear on what I mean by urban development: population increase leads to construction and infrastructure, leading to continued population increase, and so on. DeLanda would refer to this as the autocatalytic loop of urban growth. Regarding the accumulation and configuration of materials that create this urban growth, DeLanda refers to Ian G. Simmons:

Simmons views cities as veritable transformers of matter and energy: to sustain the expansion of their exoskeleton, they extract from their surroundings sand, gravel, stone, and brick as well as the fuel needed to convert these into buildings. He notes that, like any system capable of self-organization, cities are open (or dissipative) systems, with matter-energy flowing in and out continuously. 2

Many of the traditional data sets that are studied in geomorphology are easy to see, thanks to the widespread availability of satellite photography. Water quality, for instance, is a major issue in London, as the Thames is one of the more polluted waterways in the world. As we follow the Lea River south through its valley, we can see its colour change as evidence of contamination by the industrial gutter of its southern mains and tributary canals. The general practice of paving sites flat for the storage of industrial supplies, equipment, and machinery has drastically increased the consequences of chemical and mineral runoff. We can also see that in cases of abandonment, where technology has moved on from previous methods, the subsequent spaces that facilitated those earlier technologies are showing signs of natural healing, similar to the scabbing over of open wounds on the surface of living organisms.

We may call upon Deleuze and Guattari here, for the use of their abstract machine. DeLanda refers to it in discussing the sorting mechanism in geological processes, and uses the abstraction of these processes to apply its concepts to the development of urban networks.

I wish to argue…that there are also abstract machines behind the structure-generating processes that yield as historical products specific meshworks and hierarchies. 3

It is my contention that the construction aspect of urban development can be understood as a certain geomorphogenesis. A flow of energy, when brought into contact with a certain set of materials, produces a particular mineral organization. Coal, for instance, develops when dead organic material, in the absence of oxygen, is met with pressure. In the case of the Lea Valley, a certain set of energy flows met with a certain set of materials, leading to the development of a particular urban condition.

1. Huggett, Fundamentals of Geomorphology, p. 62
2. DeLanda, A Thousand Years of Nonlinear History, p. 76
3. DeLanda, A Thousand Years of Nonlinear History, p. 59


Biomorphology is the branch of biology that deals with the form and structure of organisms. Biomorphogenesis is the process by which the form and structure of an organism come into being. Inherent in this process is evolution. But new science is beginning to find flaws in Darwinian theory. Again, nonlinear dynamics and extensive computing capabilities are showing that Darwin may have overestimated the authority of natural selection. Darwin’s evolutionary process aims at an optimal form: that which is “fittest” for a particular set of circumstances. But to understand evolution as an optimizing process firstly gives too much credit to genes, and secondly reinforces the belief in equilibrium as an eventual state.

Darwin stresses the accidents of history; random shuffling of the genetic pack; competitive interactions between individuals for scarce resources; and the power of natural selection to prune the weak and the unadapted, leaving those that are fit for further reproduction and the perpetuation of their superior genetic legacy. 1

Darwinism does not account for the prevalence of generic forms that seem to occur quite frequently in nature, such as whorls or spirals. These patterns don’t appear to be accidental, nor do they seem necessary for survival. Morphogenesis is a process that has intrinsic properties of dynamic order, so that particular forms are produced when the system is organized in particular ways.

…in the context of evolution: there is an inherent rationality to life that makes it intelligible at a much deeper level than functional utility and historical accident. 2

The problem with Darwinism is its inability to account for so-called “arms races” in nature, where we find the evolutionary process in constant, complementary imbalance. This chaotic state of affairs is what can bring about drastic changes in form, and in the spirit of non-centrist human history, one might see the growth and development of humanity in this light. “Progress” can be understood as a series of critical thresholds where phase shifts have occurred. I propose this concept can be used to understand urban development as an organic, evolutionary process, in a constant computational operation to negotiate the forces that sustain it. I have tried to show that the mineral form of a city is directly linked to the vectors that allow its existence, but in order for life to occur, energy must be provided.

The 21st-century city is entirely dependent on the non-urban to provide food, and the flow of that resource is the defining force which moulds an urban condition. The European population explosion of the 18th century allowed European cities to expand dramatically as colonial trade expanded the surplus of food. This kind of urban development determined by energy flow is manifest through the construction of machines to process, sort, and distribute that flow of resources. In the case of the Lea Valley it’s clear that the waterways have been used as a link to resources, and the necessary sorting mechanisms are the bulk of its architecture. This operation between variables, a negotiation if you will, can be found at all scales of urbanism. It would be a mistake to see the urban form produced as static. We must understand the city as pure process, constant computation. It would also be a mistake to see this process as an optimizing one. The formal configuration of the Lea Valley shows 18th-century London’s vital dependency on shipping. It also shows its 20th-century shift away from shipping and toward air freight, in combination with its suburban expansion. In particular, you’ll notice the conversion of distribution zones such as wharves and factories to residential and commercial zoning, as well as the abandonment of rail services’ direct connection to the Thames.

…despite the many differences between them, living creatures and their inorganic counterparts share a crucial dependence on intense flows of energy and materials. In many respects the circulation is what matters, not the particular forms that emerge. As the Biogeographer Ian G. Simmons puts it, “The flows of energy and mineral nutrients through an ecosystem manifest themselves as actual animals and plants of a particular species.” Our organic bodies are, in this sense, nothing but temporary coagulations in these flows… 3

In the context of the urban, the spatial organization of these flows, and the subsequent forms arising from their combinations, is not random. I suggest that the process operates in the same manner as organic life. In chapter 5 of The Evolution of Complexity, Brian Goodwin explains two operations that work together to produce robust structural forms of life: generic forms and genetic evolution. There are over 250,000 different species of higher plants. Underlying this diversity is an unexpected and startling degree of order. Despite the profusion of leaf shapes in higher plants, there are basically only three ways in which leaves are arranged on a stem. The prevalence of these three patterns is due to a certain condition in the meristem. Goodwin describes this condition as:

A morphogenetic field, defined primarily by the mechanical strains in the surface layer of epidermal cells acting as an elastic shell that resists the pressure exerted by the growing tissue underneath. 4

The patterns of leaves in higher plants are clearly an example of generic biological forms - naturally stable states of a generative process in the developing organism, in this case the meristem. Genes define the region of parameter space where a particular species starts its development. That is to say, genes seem to operate as a set of variables easily worked through in a system that is already stable. The changes of shape described in the organs of a flower are examples of homeotic transformations, because one organ is replaced by another structure that belongs to the same neutral set of forms. The model used to explain how this process works is a spatial diagram of the regions of influence of particular genes.

I have made an attempt here to show similar diagrams of a particular urban condition, and I contend that the same forces are at work. In an extraordinarily diverse world, where everything looks different, we can find that generic structures underlie systems, making them similar, but not the same. In the realm of biology, homeotic mutants tell us that structures that look quite different are in fact easily transformed, one into another, by the effects of single genes. This gives us very important information about two aspects of morphology: that in the space of possible biological forms certain structures are close neighbours from a generative or morphogenetic perspective, even if they have quite different shapes; and that the clustering of certain forms in regions of shape space suggests why they occur over and over again in different species. Genes naturally do the sensible thing: they cooperate with the generic forms of the field to give robust morphologies to organisms. Genes can influence a lot of secondary properties of these forms, but the generic properties cannot be denied. This combination, of generic forms and genetic variations on their themes, results in both the diversity of form in the biological realm, and the intrinsic order of these forms that allows them to be classified.

The Lea Valley shows signs of biomorphogenesis at all scales. The entire valley is operating as a sorting and distribution device of bio-energy for an entire quadrant of greater London. These energy flows are the driving force for all architectural and urban form. The same operation can be seen at major nodes in the resource network as it meanders up the valley, and it can also be found in individual neighbourhoods as they organize in relation to those flows. The most interesting aspect of this morphological view of the valley is that it moves our understanding of the urban away from static or utopic models, and toward process and computational models found in the new sciences.

1. Goodwin, The Evolution of Complexity, p. 115
2. Goodwin, The Evolution of Complexity, p. 116
3. DeLanda, A Thousand Years of Nonlinear History, p. 104
4. Goodwin, The Evolution of Complexity, p. 120


Thresholds are the frontline of morphology. Morphogenesis occurs at the interaction of multiple systems, and not within homogeneous fields. In following this logic, one might refer to these thresholds as an interface: a condition occurring between systems where each transfers a ‘data set’ to the other(s). These data sets can be vector forces, disease, densification, dispersal, or any other type of information, both quantitative and qualitative. Interface is the stage of interaction; it is the boundary... not in a material sense, but in the abstract; the plane of communication. This plane is far from smooth. Interface is all about friction, conflict, and grand gestures. If we accept the urban as a morphological process, how can we design for that process? Most architectural projects work toward a so-called completion, but the nature of morphology is constant dynamics, not achieved equilibrium.

The Lea Valley is just such a morphological interface. It provides the necessary facilities needed by the east end of London to transfer energy and materials to and from the world outside greater London. The reason for its classification as morphological is that its interface is formal. In order to harness and manage its resource flows, London has produced a set of formal machines that take in and distribute matter and energy. This interface must be flexible because the population of London is shifting, as is its technology and its needs. As we see in its biomorphogenesis, these flows and their facilities are not permanent, but subject to overloading or abandonment depending on variables in a massive urban matrix. The last thirty years have seen a fair amount of decay in the industrial fabric of the valley, and an influx of residential and white-collar commercial development in its southern sectors. But prior to that decay, the Lea Valley was the main artery for the transport of goods for hundreds of years. In fact, it might be said that, on an urban scale, we are witnessing a morphological phase change… The initial catalyst for industry in the valley was England’s massive extraction and consumption of coal, which caused the urban to shift to an industrial phase.

…the rise of the “industrial age” will not be viewed as the result of human society having reached a new “stage of development” (a new mode of production) or of its having climbed further up the ladder of progress, but, rather, as the crossing of a bifurcation where previous autocatalytic dynamics (subject to negative feedback) came to form a self-sustaining autocatalytic loop. 1

At the onset of the 21st century, we find that industrial development on the whole is becoming clean, focusing on information technologies and environmentally sensitive production. We might see this shift as a counterpunch to the population explosion and expansion of dirty industry that preceded it. This is an excellent example of an arms race in nature, where one system is in negotiation with another system, each evolving side by side, but in an endless process without any goal save avoiding extinction brought by the other. In The Evolution of Complexity, Brian Goodwin describes the role of genes in the morphogenetic development of plants as:

…the region of parameter space where a particular species starts its development. This is determined by such quantities as the turgor pressure within the cells of the meristem, the mechanical properties of the cellulose microfibrils, the composition of the cell walls, the activities of pumps and channels that regulate concentrations of ions such as calcium, and a host of other properties that have been extensively studied by plant physiologists. 2

What is interesting and relevant to architecture and urbanism here is the range of inputs found at work in this morphogenetic interface. Measurable forces like pressure and concentration of physical elements are put through a computational machine, an interface, in order to produce a formal result. If we read the above selection only retaining terms like “mechanical properties” or “pumps and channels”, the leap to our Lea Valley is not so hard to imagine.

But where is the interface? How does it operate? What does it translate? I argue that the interface is the valley in a very physical sense. For example, let’s take a look at the fishing industry. This source of biomass is brought up the Thames, to nodes on the banks where the energy is converted into a workable form, i.e. processed or packaged for delivery to a secondary set of distribution stations. These secondary nodes make additional modifications to the biomass, in order for it to be consumable. In this process we see two different phases of the material in question: first a living organism, second a consumer product. In order for material to move from one phase to the other, it must be processed through a morphogenetic interface. This interface is in constant computation of its range of inputs to translate between phases, such as commercial value, or formal reorganization for further transport, or preservation of nutritional value. Each of these changes is essential for the transfer of biomass to occur in a fully urbanized condition. But we can easily arrive at a chicken-or-egg line of questioning here… The urban condition could not exist without such interfaces, and the interfaces would not exist if not for urbanism. The Lea Valley operates as this interface by containing the necessary machines for the process of translation.

1. DeLanda, A Thousand Years of Nonlinear History, p. 73
2. Goodwin, The Evolution of Complexity, p. 133


Richard John Huggett, Fundamentals of Geomorphology (London, 2003)
Brian Goodwin, How the Leopard Changed Its Spots: The Evolution of Complexity, Reprint Edition (New York, 1994)
Manuel DeLanda, A Thousand Years of Nonlinear History, Swerve Edition (New York, 2000)

Five years ago, new media theorists were heralding the development of the interface and the new digital worlds we were creating. Lev Manovich and Steven Johnson both wrote spectacular biographies of the modern interface at the close of the 20th century. In ‘Navigable Space’ and ‘Interface Culture’, we follow the computer’s evolution. In both cases the author speaks to great expectations for the future. And as these forerunners expected, the new digital interface has become the dominant tool for research, communication, and expression throughout the industrialized world. No achievement of man can compare to its speed or impact, and new developments are presented every day. I would like to take a look at some systems of navigation through information space, in particular a few whose interface is urban. It is my contention that recent advancements in the design of virtual environments and their navigation systems can offer a critique of urbanism as a discipline and a culture.

I do not, of course, want to write an “update” to Manovich, or Steve Johnson for that matter. What I am interested in is how new media and the interface have augmented our understanding and development of the urban. While Johnson’s biography of the interface is compelling, it is of a technology that moves too quickly to be compiled or understood; and Manovich offers great insight into the cultural side of games, but from somewhere in the middle of the landslide of technological development. What makes these writings relevant is also what dates them. Like an old version of shareware... absolutely essential, but only in the development of something beyond itself. It seems that the two pieces I am focusing on were written just before another major breakthrough... which is of course followed by another breakthrough, and so on, and so on, and so on... I have selected a few recent applications to discuss due to their unique focus on the urban. While these applications are, on the one hand, evolutionary, they are also revolutionary. I will look at the navigation systems of SimCity 4 and GTA Vice City, as well as their urban fabric and cultural implications. This paper is an attempt to reveal how we understand and negotiate the urban, considering the impact of two particular computer games. These simulations offer a “safe mode” for life, each providing a certain psychological band-aid for the leery user-next-door. But if we look closely, there is a lesson here for the urbanist as well... a clue as to how the city is changing, and what new forms it may begin to take.

Each of these titles has a fairly long history of development. If we look at the narrative structures of these applications - a popular comparison used by Manovich and Johnson - these new versions are nothing more than an evolutionary step in interface design. Just for a moment, I would like to look at that evolutionary development, as I think it will help emphasize the immediacy of their cultural implications. In the last six years the digital gaming industry has seen an amazing amount of growth. The technology is progressing so fast that it would seem there is no obstacle to development. Yet the variety of gaming concepts has changed little in recent years, as developers concentrate instead on pushing the boundaries of their games’ spatial character. In general, the narrative structures of these two games are over fifteen years old, and that is only within the context of the digital interface. It could be said that the adventure and simulation genres have existed for thousands of years. I propose that the front of development in the gaming industry is not led by the conceptual designers of games, but by the designers of hardware. Successive iterations of high-speed processors, combined with increasing memory capacities, are improving machines, allowing for more and more complex versions of the same stories.

In the case of adventures, a single hero is introduced to a discrete space and offered a controlled degree of navigation and participation. We have seen this example countless times. There have been more or less successful behavioural designs from game to game, and some in particular left a considerable impression on the industry. The very first arcade games, such as Pong, Space Invaders, or Arkanoid, presented a closed environment in which the player was to manoeuvre. In these cases there was a clear division of space, and although the entire frame was engaged, there was no environment to explore. In the early to mid eighties, arcade and home console games like Kung Fu or Super Mario Bros. moved toward a certain style of linear progression and successive achievement. Although this change offered an extended environment beyond the closed frame of the screen, the user was offered no choice in direction or pursuit. These were completely flat environments, presenting a static background and a singular linear progression, to the right. These early attempts at spatialization were totally dependent upon hardware capabilities. An excellent example of this can be found in the background design of these panel-style interfaces, where the complexity of the background was directly linked to available memory. When I was nine years old, The Legend of Zelda was released by Nintendo. The plan view of Zelda made it an enormous success, but there is one specific aspect of this particular game that subtly changed the nature of all games to come. In Zelda, an entire world was offered, and it was up to the individual to decide what path to take. Each player had a unique experience, participating in their own sequence of events at their own pace. It is this narrative structure that underlies all spatial adventure games today.

The simulation offers a set of tools to facilitate a specific set of activities or operations, and a discrete space in which to perform them. Considering the gaming industry from this point of view, we see consistent progress in the development of behaviour sets. The on-screen agent is programmed to respond to user input through a specific set of control combinations. This type of game is also quite classic, considering the capabilities of the original athletic games designed for home consoles. Nintendo’s Track & Field, or Mike Tyson’s Punch-Out, were very basic attempts at simulation. In the early days of home simulation, the PC was where real progress was being made. Military simulators easily crossed over to the entertainment market, offering fighter jets, battleships, and submarines. It might be said that this quick jump pushed gaming into another split of sorts, with some games focusing primarily on behavioural content: realistic control and response.

The contemporary titles I mentioned earlier are direct descendants, and slight crossbreeds, of the classic adventure and simulation genres. In the case of Grand Theft Auto, the basic narrative structure is that of The Legend of Zelda. You are given an on-screen agent and presented a discrete environment in which you can perform a limited set of operations. As you accomplish particular goals, new opportunities become apparent. The other vital similarity is the freedom to explore the environment without any time-based consequences. SimCity 4 offers an unprecedented level of complexity in a long-running series that treats urban development as entertainment. The level of control in this new version allows for detailed geomorphological control over randomly generated earthen realms. Atmospherics are breathtakingly realistic, and the depth to which this reality sinks is unimaginable. Individuals living in simulated urban networks carry on full lives, from commuting to working to causing trouble. There are worlds to get lost in, bound only by imagination… and code.

The most important fact regarding these two titles is not, however, within the confines of the games themselves. Grand Theft Auto: Vice City sold 5 million copies before it was even released. 1 SimCity 4 was the number one PC game in North America in 2003. 2 These two titles have had unprecedented exposure. Is it possible that this exposure has affected our understanding of the urban condition? Can these games be viewed as a critique of how we design our cities? In the case of SimCity 4, there is no denying the appeal of controlling capitalism and politics, and this, I would argue, is the most psychologically important feature of the game. The creator of SimCity, Will Wright, is well aware of this point.

Yeah, a lot of people like the control aspect. To them it’s sort of “I’m a god, you’re my sim, obey me.” But the sims don’t obey. That’s what makes it fun. If they obeyed you, what fun would that be? You’re trying to keep this city together, but it keeps falling apart. That’s what makes most really good games fun: When you have a certain amount of control, but then there’s a certain amount of entropy in the system, and it’s balanced just right between the two. It’s life at the edge of chaos. Shit happens. But of course I don’t have a variable in there called “the chaos variable.” So, to tune it I have to engineer a lot of little low-level variables to try and get the emergent behaviour that will give me that edge of chaos to keep it from getting boring. 3

What I find interesting about SimCity is its immense popularity, and what that might say about the culture that follows it. First of all, a desire for control, as Wright has stated, is obviously its core appeal. For those frustrated by the impossibility of controlling the real world around them, simulations of the urban provide a safe fantasy realm for the indulgence of all kinds of control mechanisms. I do find the game’s full faith in and devotion to the capitalist model a bit concerning. Here we find traditional evolutionary theory at its most rigid. Time marches on and your cities grow unbounded. Proper management of resources and a prudent eye on the finances yields economic surplus, which means more development. As an architect I can find the game a bit frustrating, with its limited palette of building types, but its development model is perfect for those without the means to buy real land and construct real buildings. This game can be seen as the white-collar fantasy world of suburban America, the ultimate simulation of manifest destiny and the American dream… you too can be a Rockefeller.

There couldn’t be a more appropriate name for the latest release in the Grand Theft Auto series. Vice City is exactly that, a haven for all vices, where freedom from consequences allows one to indulge in violent excess. And with millions of copies sold, it’s no wonder society in general has become quite concerned with the prevalence of crime and violence in the game. In particular, the GTA series has been linked to violent crimes committed by teenagers across the United States.

A gang of teenagers in California, charged with plotting carjackings and murder, say their actions were inspired by playing Grand Theft Auto, morning, noon, and night. In Oakland, Calif. a group of young people who called themselves the “Nut Cases” told police they played violent video games before going out and robbing and killing random victims on the street. They said their favorite was Grand Theft Auto. The five men and one woman are facing charges in dozens of robberies and five killings that took place in 2002 and early 2003. The Entertainment Software Association, which represents the video game industry, pointed to research showing that youth crime has gone down even as video games have proliferated. The games are rated for violence, and ultimately, parents make the decisions about what games they bring into their homes, the association said in a statement. “Parents are present at and involved in the purchase or rental of games 83 percent of the time, according to a September 2000 Federal Trade Commission report,” the association said. Family members of those killed in video game-related shooting sprees say it is time to take Grand Theft Auto off the market, before more lives are lost. 4

What is it in this kind of consequence-free violence that seems so attractive to American youth? Are these teenagers making the mistake of confusing reality with the gaming world? Or perhaps many of these gamers are having their first urban experience within the discrete space of Vice City, and learning from it how to operate when faced with its real-world counterpart: a sort of urban simulation. While almost every video game produced to date has some manner of violent content, the degree of fantasy in previous titles allowed for an easy separation from reality. It was apparent, even to young children, that the environment on the screen, and the events that occurred in that environment, were not real. But the constant advancement of gaming hardware has left game design nearly unlimited in its ability to simulate the real, and so these games become more and more complex, more and more real. In this simulated reality one of the most important advancements has been the development of the digital agents employed in these games. It is my contention that the development of a believable character, an everyman, is essential in the crossover from digital entertainment to “I can do that”. This metaphor is discussed by Steven Johnson in Interface Culture as follows.

One of the puzzling things about agent software is that the surface representation of the agent itself is so malleable. It is a general assumption in this book that our visual metaphors are as important as the underlying functions they signify. On the face of it, agents would seem to be an exception to this rule. The three types of agent – personal, travelling, and social – can be represented in many different ways… 5

But the believability of the agent in these kinds of simulations is key to convincing the gamer of the reality of the gaming environment, and in extreme cases it is what leads to the kinds of violent crimes reported by the national news syndicates. If gaming continues down the path of more and more realistic simulations of violence, will violent crime rise? There are strong arguments on both sides. But I think the more interesting question is: when will the designers of games and their play-spaces move beyond earthly simulations, and begin creating completely fantastic, unreal, inhuman digital entertainment? Have information technology and net interface design taken our best designers, leaving game design to Hollywood? Or has interface design in the functional capacities of the internet become too stale a field to sustain the interest of digital designers, forcing them to offer their skills to the entertainment media machine? I would contend that neither is true. As I said before, the innovation is not coming from the designers at all, but from hardware engineers building more and more powerful machines, able to compute larger, more complex simulations. Because these innovators are setting the pace, designers haven’t the time to develop entirely new fields of gaming and interface, even though the technology could easily support drastically different types of spatial exploration.

1. Current Sauce, GTA Vice City sells 5 million copies
2. EA, EA Unveils Plans for SimCity™ 4 Rush Hour
3. Kelly, Will Wright: The Mayor of SimCity
4. ABCNEWS, Deadly Inspiration?
5. Johnson, Interface Culture, p. 179


Steven Johnson, Interface Culture (San Francisco, 1997)
Lev Manovich, 'Navigable Space', navigable_space.doc (4 April 2004)
Kevin Kelly, 'Will Wright: The Mayor of SimCity', wright.html (4 April 2004)
Current Sauce, 'GTA Vice City sells 5 million copies', 07/Editorials/Grand.Theft.Auto.Vice.City.Hits.Shelves.5.Million.Copies.Sold-319792.shtml (29 April 2004)
EA Press Release, 'EA Unveils Plans for SimCity 4 Rush Hour, The First Expansion Pack in the SimCity Franchise', pressrelease_ep1.php (29 April 2004)
ABCNEWS, 'Deadly Inspiration?', sections/gma/US/GMA030905Grand_theft_murders.html (29 April 2004)

Density is the problem of architecture. At the moment when our population growth curve begins to soar vertically, the profession intended to manage the organization of that population seems to have disappeared. Architecture today has become a graphic-oriented, expressionist medium. At the very pinnacle of academic discourse one finds minds absorbed with the virtual, the non-existent. The popularity of signature buildings by star architects fuels the egocentric drive of young architects to make their mark. And yet an entire industry continues to make buildings. How is it that, in the face of research and statistics so glaringly apocalyptic, an entire profession can choose not to be involved? What are the circumstances that led to such a position? Is it possible to gain lost ground? Above all else, can we contain a population crisis? The following is an attempt to bridge a piece of high theory with the reality of low practice. Through an examination of written work by Stan Allen and research by Winy Maas, I would like to propose a new tactic that could change the role of architecture. This is a call for responsibility. These two minds have assembled a new method for understanding the urban. This new understanding is not utopian. It is not critical. It is real. It is the acceptance of a given.

In 'From Object to Field' Stan Allen seems to search for a middle ground between the academic and the real. His first move is to align himself with the architect’s duty to context, and the truth of the site. His notion of field conditions “implies acceptance of the real in all its messiness and unpredictability” (Allen 24). And he uses examples from various disciplines as proof of this undercurrent on the rise. In order to qualify as ‘field conditions’ a few general characteristics are listed. That fields are “loosely bound aggregates characterized by porosity and local interconnectivity” (Allen 24) is of key concern in my expansion on his theory. The only difficulty I have with Allen’s article is an early comment, which seems only to remove him from the responsibilities entangled within the article. He says, “The theoretical model proposed here anticipates its own irrelevance in the face of the realities of practice” (Allen 24). I would suggest that its relevance is the only worthwhile purpose for such an investigation. While thorough, his analysis misses an important opportunity to discuss real change through an understanding of fields, and how to deal with them globally. In all his examples, no matter what the discipline, field concepts are being applied in the real. While Allen is beginning to explore the potential of critical-mass theory as it applies to architecture, one might ask why he does not also discuss its application in the field of real urbanism.

MVRDV is well known as a research-oriented studio. Since its establishment, many intelligent projects have been completed, attracting considerable attention. In 1999 the statistical analysis METACITY and its hypothetical counterpart DATATOWN were released. The term ‘metacity’ refers to a certain threshold of global communications and its generation of countless relationships. These relationships, in combination with the rapid transportation of goods and services, have advanced major regions of the earth into urban fields. These fields, rather than developing more cores of urban activity, become networks emphasizing the importance of transit rather than accumulation. METACITY/DATATOWN is an attempt to grasp the contemporary city through data alone. The general perspective it presents is similar to a theory of field conditions. It is also a proposal on how to work with the reality of the urban. It is my contention that this research-and-response package, coupled with Allen’s theory of field conditions, can set a precedent for a new urbanism.

Stan Allen’s investigation of architecture and of other disciplines studying field conditions is quite a leap forward from recent discourse. 'From Object to Field' examines art, music, mathematics, architecture, and digital media for evidence of fields at work. His grounding in mathematics, and its shift from geometric to algebraic combination, establishes a research agenda as opposed to a critical one. This mathematical distinction is arguably the most important of the article, and is of particular interest in my theoretical expansion, aligning with Maas’s emphasis on statistical analysis. Allen uses the continued construction of the mosque at Cordoba as an example of an architecture generated as field rather than as object. There are two major points on field conditions that Allen illustrates with Cordoba: first, that a field is inherently expandable, and second, that its overall form is an elaboration of conditions established locally. All of the work done at Cordoba has been a pure repetition of the local scheme, no matter the period of the work. Constrained simply by the borders of the site, the individual unit of the mosque has been repeated to full capacity. This system of repetition and expansion, what Allen later refers to as generation through a ‘sequence of events’, can be seen in suburban development around cities throughout the United States. The constant building of the suburban condition displays the same method of generation as the expansion of the mosque at Cordoba.

There is a double reading in the suburban sprawl of America, and it is easy to offer an object-focused critique of the built work that continues to appear. All projects commissioned to architects are conceived of individually for a specific client, and produced on a unique site. At this level the industry is still dealing with an architectural object. Allen makes a strong case for the prevalence of this perspective in architecture, by examining theoretical shifts in the visual arts. Beginning with sculpture’s move away from cubist methods and the emergence of minimalism, he demonstrates the continued focus on the object and its physical qualities. “While painting and sculpture have gone beyond Cubism, architecture, I would argue, is by and large still operating with compositional strategies borrowed from Cubism” (Allen 27). With the global popularity of signature buildings, not only is architecture objectified, but the architect is as well.

What if we retreat to a more distant perspective, away from the ego of humanity; is a different reading possible? All new towns seem to grow in the same way: as additions to existing fields, accommodating local constraints, and attempting to associate with a given or desired local identity. All structures are put up in relatively the same manner, at the same pace, everywhere. While zoning ordinances are generated by local authorities to produce desired local effects, overriding building codes seem to level the field, setting up a generic framework for development. What I find extremely interesting is that, while the aesthetics of individual projects vary, there is definitely a consistent pattern of growth. This consistency is due to the evolutionary process of urban development. Similar to the work of the ‘post-minimalist’ artists Allen speaks of, this process is an extremely complex ‘sequence of events’ that is impossible to control or predict.

To understand the urban as its own beast is to study it as we do other living things, through data collection and statistical analysis. What would be the difference in an objective rather than critical architecture? The theoretical implications are well understood by Allen, but what are the practical implications? How are we to respond to them? METACITY/DATATOWN is a good example, attempting to make a pragmatic design proposal to address the chaos of contemporary urbanism. In this piece, the real terms of what it means materially, volumetrically, for humanity to use earth, are examined by an architect. While the design proposals tend more toward audacity than plausibility, the raw factual content is staggering, and is meant to instigate a sense of urgency. I propose that this statistical analysis of urbanism be applied as a method of gathering information to formulate a new approach to developing our cities. If, in accepting this new distant perspective of architecture, we can accept the reality of the given, then perhaps architecture can engage with the crucial problems it faces. I am not suggesting that we begin to construct DATATOWN. Its importance is in the organization of its body of information, not the formal result implied.

The field conditions at play in the American city are apparent, but how have they developed? Allen’s understanding of the origins of these conditions is in the initial application of the Jeffersonian grid. He claims its pragmatic and universal characteristics offer an open field, easily organized and developed. Referring to its “intrications and perturbations”, Allen says “pragmatics unpacks the ideality of the grid” (Allen 27). As the needs of those hosted by the system multiply, the system generates new zones of accommodation. These zones are defined by the inclusion of a series of parts, whose connections are possible in a myriad of configurations. Each zone, or suburb, requires a certain number of pragmatic structures in order to operate in conjunction with the existing field. The placement of these units is determined by local issues, and their physical proximity to one another is more or less determined by the existing framework of the field.

How does this field continue to generate itself? If we were to attempt to find a root, what would it be? How does that root instigate growth? Again, by observing the system from a distance, perhaps a series of generalizations could help us understand the development of the urban. All suburban growth seems to make its first extensions residentially, developing zones to accommodate expanding populations. When the number of outlying zones begins to draw too heavily on the resources of its originating zone, new resource pods are sprouted, diverting traffic, changing dynamics, instigating new configurations, in order to sustain the system. This sounds very similar to the organic patterns of growth found in bacteria. In cases of bacterial infection, a host is occupied and reproduction continues until maximum capacity is reached. At that point two general options are available. First, the bacteria can be left to breed; when the resources of the host are depleted, the population must either find a new host or die. Alternatively, the infection can be treated in order to save the host; in these cases, a counter-force is introduced to kill off the bacteria. I have chosen this crude example to show the stark similarity between the tendencies of bacteria and of humanity. Like bacteria, we seem unable to recognize the finite character of our host, and continue to support a culture of consumption. Can humanity afford this kind of unbridled expansion? Is it possible for humanity to be critically aware of itself?
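The bacterial dynamic described above is, in effect, the textbook logistic growth curve: near-exponential expansion that flattens as the host's finite capacity is approached. A minimal sketch in Python, with invented rate and capacity values:

```python
# Discrete logistic growth: expansion in a host with a finite carrying
# capacity. Rate and capacity are arbitrary illustrative values.

def logistic_step(p, rate, capacity):
    """One step of growth, slowed as the population approaches capacity."""
    return p + rate * p * (1.0 - p / capacity)

def grow(p, rate, capacity, steps):
    history = [p]
    for _ in range(steps):
        p = logistic_step(p, rate, capacity)
        history.append(p)
    return history

# Near-exponential growth at first, then a plateau as resources run out:
# the 'find a new host or die' threshold of the bacterial analogy.
history = grow(p=1.0, rate=0.5, capacity=1000.0, steps=40)
```

The plateau is the moment of crisis the essay points to: the system either finds a new host or collapses.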

Considering that neither extinction nor extermination is attractive, a middle ground must be investigated: one that is acutely aware of humanity, both internally and externally. A balance between the two could produce a new urban strategy that gives priority to neither humanity nor host. By using a statistical approach similar to that of Maas’s METACITY, we might wholly understand the scope of suburbanization. By mining the urban field for statistics, its true impact can be understood as pure, raw data. Census data, for example, could be of huge benefit, in that it is location-specific. From these types of data sets, information fields can be projected across regions, explaining demographics, densities, economics, social norms, and consumption patterns. Allen speaks of a “thickening of the surface” as an explanation for zones of urban intensity; not developing individually, but emerging as abstract figures from the field itself. Moiré patterns can demonstrate such behaviors, and Allen makes a good case for this. The application of such ideas to the urban is very interesting. If the moiré occurs through the combination of two regular fields, and we can produce these mathematically, then what happens if the expressions for geometric patterns are exchanged for those of urban statistics? What vortexes would these combinations produce? What could we learn from them?
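The geometric base case of the moiré can be made concrete with a short sketch: two regular sine gratings, one rotated slightly, whose product carries a low-frequency beat that neither field contains on its own. This illustration is mine, not Allen's or Maas's; the frequencies and the small angular offset are arbitrary choices, and substituting urban statistics for the gratings remains the speculation the text proposes.

```python
# Two near-identical regular fields (sine gratings), one rotated slightly:
# their product carries a slow beat pattern that neither field contains
# alone. Frequencies and the 0.05 rad offset are arbitrary choices.
import math

def grating(x, y, angle, freq):
    """A regular striped field with values in [0, 1]."""
    u = x * math.cos(angle) + y * math.sin(angle)
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * freq * u))

def moire(x, y):
    return grating(x, y, 0.0, 10.0) * grating(x, y, 0.05, 10.0)

# Render as ASCII: the broad light/dark bands are the moire figure,
# an 'abstract figure from the field itself', not either grating.
rows = ["".join(" .:#"[int(moire(x / 40.0, y / 40.0) * 3.999)]
                for x in range(60)) for y in range(20)]
print("\n".join(rows))
```

Exchanging the two trigonometric expressions for two interpolated statistical surfaces (density against income, say) would be the urban version of the same interference.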

In order to manipulate a field, we must first grasp the dynamics of that field and the relationships that sustain it. There is no doubt Allen understands the flock phenomenon. As he reviews in his piece, this kind of behavior has been studied in science and mathematics for nearly twenty years. In all cases it can be reduced to a short set of rules governing local connections, similar to those originally conceived by Craig Reynolds. Allen goes as far as to suggest that human crowds display the same behavior, while recognizing their heightened complexity. Still, he refers to Elias Canetti’s Crowds and Power to support the connection. “According to Canetti, the crowd has four primary attributes: the crowd always wants to grow; within a crowd there is equality; the crowd loves density; the crowd needs a direction” (Allen 29).
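The short rule set in question, Reynolds's separation, alignment, and cohesion, can be sketched in a few lines. This is a schematic illustration only, assuming invented weights and a naive all-pairs neighbourhood; a tuned boids implementation adds neighbourhood radii and speed limits.

```python
# Reynolds-style flocking: each agent steers by three local rules.
# Weights, radii, and the all-pairs neighbourhood are invented for
# illustration; this is a sketch, not a tuned implementation.

def flock_step(boids, dt=0.1):
    """Advance every boid one step. boids: dicts with 'pos' and 'vel' tuples."""
    new = []
    for b in boids:
        others = [o for o in boids if o is not b]
        n = len(others)
        # Cohesion: steer toward the neighbours' centre of mass.
        cx = sum(o["pos"][0] for o in others) / n - b["pos"][0]
        cy = sum(o["pos"][1] for o in others) / n - b["pos"][1]
        # Alignment: match the neighbours' average velocity.
        ax = sum(o["vel"][0] for o in others) / n - b["vel"][0]
        ay = sum(o["vel"][1] for o in others) / n - b["vel"][1]
        # Separation: push away from any neighbour closer than 1 unit.
        sx = sy = 0.0
        for o in others:
            dx = b["pos"][0] - o["pos"][0]
            dy = b["pos"][1] - o["pos"][1]
            if dx * dx + dy * dy < 1.0:
                sx, sy = sx + dx, sy + dy
        vx = b["vel"][0] + 0.01 * cx + 0.1 * ax + 0.5 * sx
        vy = b["vel"][1] + 0.01 * cy + 0.1 * ay + 0.5 * sy
        new.append({"pos": (b["pos"][0] + vx * dt, b["pos"][1] + vy * dt),
                    "vel": (vx, vy)})
    return new

# A small flock; the three rules are pairwise balanced, so the group's
# average velocity (its 'direction') is conserved while local order emerges.
flock = [{"pos": (float(i), float(i % 3)), "vel": (1.0, 0.0)} for i in range(5)]
for _ in range(100):
    flock = flock_step(flock)
```

Note that none of the three rules supplies a direction; the herding question that follows is precisely about what external stimulus would.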

What is it that gives a crowd, a herd, or a school its direction? In observing a cowboy work his cattle, some potential lessons might be learned. How do herds react to force? In the case of urbanism, could we herd populations? What sort of stimuli would be effective? Broad statistical research could lead to ideas on how to deal with the urban condition at a scale fitting its impact. If the contemporary American city is considered a growing, self-referential field with its own rule set for continued expansion, then individual injections of change will be accommodated but will not instigate an evolutionary shift in the system, because field organizations allow local difference in order to maintain the stability of the whole. A new tactic is necessary to achieve real influence on contemporary urban fields. By adopting a herding strategy, architecture might just round up the urban and organize it into more sustainable patterns of development. Either by introducing minor changes to local system constraints evenly across the entire field, or by finding the right pressure points and applying substantial force, global difference can be actualized. Based on evolutionary models, the field should initially absorb such forces, and subsequently reconfigure in order to survive. I would contend that the scale of implementation is the key to change.

Allen’s conclusion is to propose a ‘logistics of context’. He claims a well-developed theory of field conditions is the way to rise out of the debate over how to deal with context, by “acknowledging the distinct capabilities of new construction, and at the same time recognizing a valid desire for diversity and coherence in the city” (Allen 30). I agree completely with this, but there is another question subtly avoided: that of the potential growth of the role of the architect. Allen calls us to “recognize the limits of architecture’s ability to order the city”. I ask the opposite. The majority of contemporary American architects produce single structures for private or corporate clients. Higher education produces thousands of these professionals to satisfy the demand of our exploding population. But who is managing the growth globally? Architects need to step up to their professional responsibilities and begin to design projects capable of accommodating the true breadth of humanity. Perhaps the industry could split in two: the majority of its workforce would continue to manage the construction process, while another group adopted the massive organizational role of herding the urban into a manageable body.

This new role could become a true contemporary professional urbanism, taking on global organizational issues. If the urban is looked at from a scientific or natural perspective, then its relationship to its host can be managed in a responsible way. I am proposing an entirely new scale of design project, similar to the megastructure, but in a fluid, diagrammatic sense: not permanent structures to be designed and built, but organizational structures operating as constraints on large regions of the urban field. These large projects could be developed through statistical analysis, producing intelligent and adaptable containers and networks, as well as grafting major natural regions onto existing, inefficiently organized zones. The aim would be to bring the urban back into balance with the natural. Many of these types of strategies are being considered by MVRDV in projects such as Pig City, Container City, Silodam and the Brabant Public Library. These projects all operate at a level of density well beyond any contemporary city, and they process massive statistical relationships. I think these kinds of projects can be realized in the United States. If American architects adopt a herding strategy, based on the view of the urban examined here, they could regain a certain leverage on the city and put an end to the unrestrained consumption of suburban expansion.


MVRDV, Metacity/Datatown (Rotterdam, 1999)
Stan Allen, ‘From Object to Field’, Architectural Design: After Geometry (London, 1995), pp. 24-31


A cloud is neither a singular object, nor a mass of individuals, but a mapping of kinetic forces and their residual effects on a network of interconnected particles.

In order to recreate the types of behaviours observed in clouds, a time-lapse photographic study was used to record the movement of singular bodies, on individual trajectories, through a designated field. While each of these paths has its own particular properties, our interest was focused on the disturbance they caused within the field they encountered, and the accumulation of multiple disturbances over time. Through digital mapping techniques we were able to analyze the various trajectories and their relationships. Although the initial footage showed no signs of a collective, directed movement, these tendencies became evident as each path was overlaid onto a single diagram.
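The overlay procedure can be sketched in a few lines. This is a hypothetical reconstruction (the function and data are invented for illustration): each trajectory is a list of grid positions sampled from the time-lapse footage, and accumulating them onto one map makes collective tendencies legible that no single path reveals.

```python
# Hypothetical sketch of the diagram-overlay step: individual trajectories
# are summed into a single disturbance map of the field.

def accumulate_disturbance(trajectories, width, height):
    """Sum per-cell visit counts of many trajectories into one field."""
    field = [[0] * width for _ in range(height)]
    for path in trajectories:
        for (x, y) in path:
            if 0 <= x < width and 0 <= y < height:
                field[y][x] += 1
    return field

# Three short invented paths crossing a 5x5 field.
paths = [
    [(0, 0), (1, 1), (2, 2)],
    [(2, 0), (2, 1), (2, 2)],
    [(4, 4), (3, 3), (2, 2)],
]
field = accumulate_disturbance(paths, 5, 5)
# All three paths pass through (2, 2), so that cell registers the
# strongest accumulated disturbance.
print(field[2][2])  # 3
```

No single path shows the convergence, but the accumulated field does, which is the point of the overlay diagram.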

The subsequent modelling exercises were attempts to investigate this collective movement as a series of vectors through various types of fields. In each model, a regular field was introduced to outside forces that displaced its organization. In some cases that reverberation was only a temporary disturbance; in others it left a lasting impression.

Another consistent change in the development of our cloud models was the degree of connection between the particles. The more connections that exist in a group, the more that group will behave as one body, allowing the transfer of forces from one individual particle to another. Each model included an effort to increase the density and complexity of the network of voxels, in order to facilitate the transfer of forces between them.
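A minimal numerical sketch (all names and the diffusion rule are assumptions, not the models' actual mechanics) makes the connectivity claim concrete: a displacement applied to one particle transfers along every connection each step, so a densely connected network equalizes far faster than a sparse chain and behaves more nearly as one body.

```python
# Sketch of force transfer through a particle network: each edge carries a
# fraction of the displacement difference between its two endpoints per step.

def diffuse(displacements, edges, rate=0.5, steps=1):
    """Transfer displacement along every edge for a number of steps."""
    d = list(displacements)
    for _ in range(steps):
        nxt = list(d)
        for i, j in edges:
            flow = rate * (d[i] - d[j]) / 2.0
            nxt[i] -= flow
            nxt[j] += flow
        d = nxt
    return d

chain = [(0, 1), (1, 2), (2, 3)]          # sparse: a line of four particles
dense = chain + [(0, 2), (1, 3), (0, 3)]  # every pair connected
start = [1.0, 0.0, 0.0, 0.0]              # disturb one particle

print(diffuse(start, chain, steps=3))
print(diffuse(start, dense, steps=3))
# The densely connected field spreads the disturbance more evenly.
```

The total displacement is conserved in both cases; only the connectivity changes how quickly the group absorbs the force as a collective.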

The different types of clouds we created were composed of unique materials, which generated a specific set of outputs. While some fields had elastic tendencies, and therefore quickly returned to an equilibrium state, others had attributes that allowed more residual effects to occur. In all cases the field disturbance was unique to the variables presented by the body/vector and particle/field. Each experiment had a specific set of inputs that were regulated in order to allow for the extraction of quantifiable data from the system.

In order to address the specific paths diagrammed through our time-lapse recording procedures, the vectors were introduced to each system in opposition to the regularity of the field. Although the components of our machines varied in each case, the established method was the same: to introduce a conflicting vector to a regular field and observe the resulting disturbance within that field.



In this workshop, we used the data sets of Workshop 1 as input for a digital cloud machine. We established a series of rules that informed the development of this machine, and then subjected it to a series of vector-based input sets. These pedestrian vectors, derived from video of the South Bank Centre, were projected into the third dimension and set into motion.

Our intention was to subject the machine to such data and, through the establishment of certain types of scripting, convert that input into an output set offering both quantitative information and qualitative, sensorial effects. These effects include, but are not limited to: changes in transparency, light, and form.

By reviewing the video of the South Bank Centre, and mapping the recorded pedestrian activity, a 2D diagram was created in Workshop 1 to analyze patterns of movement and their effect on a surrounding volume. In order to articulate the third dimension of our test volume, these vectors were extruded. By creating a time/distance graph, the vectors were extruded at a rate of one unit per frame of condensed video.
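The extrusion step above reduces to assigning each sample's frame index to the third axis. A hypothetical sketch (the function name and sample path are invented), assuming one unit of depth per frame of condensed video:

```python
# Lift a 2D pedestrian path into 3D: the third coordinate is the frame
# index scaled by a chosen units-per-frame rate.

def extrude_path(points_2d, units_per_frame=1.0):
    """Turn a sequence of (x, y) samples into (x, y, t) vertices."""
    return [(x, y, i * units_per_frame) for i, (x, y) in enumerate(points_2d)]

path = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.5)]
print(extrude_path(path))
# [(0.0, 0.0, 0.0), (1.0, 0.5, 1.0), (2.0, 1.5, 2.0)]
```

The resulting polyline is the space-time vector that the digital cloud machine receives as input.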

This data informed a concept for generating emergent properties from the relationship between vector and volume. Our interest was focused on light, transparency, and movement as global emergent properties of the field. The selection of these qualities was based on their ability to enhance or modify the ambient properties of an environment. Translated to the local level, this meant that each individual component of the field did not have to materially emit these qualities, but that their communal behaviour would modify the environment within which the field existed.

Equilibrium within the organization of the particles allowed for a specific examination of the effect of the test vectors on the field, maximizing locatable and measurable differences. These homogeneous units, arranged within a specific grid, are given simple, local rules of constraint. There is an absence of local hierarchy or control, and the mathematical relationship between the activator and charged particle is designed to emphasize the unrestricted nature of 3D splines against the rigidity of the grid. Exponential growth through the multiplication of input also occurs, rendering ultra-intense zones of activity.
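One way to read the activator/particle relationship is as a distance-decay charge rule. The sketch below is an assumption for illustration (the function, the inverse-distance rule, and the data are all invented, not the workshop's actual scripting): homogeneous grid particles are charged by their proximity to each activating spline sample, and multiplying the input multiplies the activity where zones coincide.

```python
import math

# Charge a rigid grid of particles from a set of activator (spline sample)
# points, with intensity decaying by inverse distance.

def charge_field(grid_points, activator_points):
    """Sum an inverse-distance charge at each grid particle."""
    return [
        sum(1.0 / (1.0 + math.dist((gx, gy), (ax, ay)))
            for ax, ay in activator_points)
        for gx, gy in grid_points
    ]

grid = [(x, y) for x in range(3) for y in range(3)]  # rigid 3x3 grid
spline = [(1.0, 1.0)]                                # one activator sample
single = charge_field(grid, spline)
doubled = charge_field(grid, spline * 2)             # multiplied input
# Multiplied input multiplies the charge, rendering intense zones of activity.
print(doubled[4] / single[4])  # 2.0
```

Under this rule the particle nearest the activator always carries the highest charge, so the free spline reads directly against the rigidity of the grid.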



Can the diffuse qualities of a cloud be simulated? Is it possible to control and/or manipulate the constant reconfiguration of a field of particles in order to produce a range of ambient qualities?

The gradient of transparencies produced by the WS2 digital cloud instigated a series of material studies in WS3, aimed at a controllable and modifiable field of particles that would adjust its global ambient qualities of light and transparency according to the local relationships composing that field.

Our material studies included inflatables, elastics, and fabrics, each contributing its own characteristics to our study of controlled ambience. The following catalogue of images explains our process of investigation, and which characteristics were influential in developing the final field.

In all cases our material studies seemed to emphasize the relationship of field expansion to degree of transparency. For instance, the expansion of an elastic balloon effectively thins the material, dispersing its make-up. When this happens, light is allowed to penetrate the material. In the case of fabrics, when a weave is opened the same result occurs.
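The expansion/transparency relationship can be put in a back-of-envelope formula. This is a model we are assuming, not a measurement from the studies: if the membrane's volume is conserved, stretching its area thins it proportionally, and transmitted light rises with thinning (a simple Beer-Lambert-style attenuation, with an invented absorption coefficient).

```python
import math

# Model a stretched membrane: volume conservation thins the material as its
# area grows, and thinner material transmits more light.

def transmittance(area_ratio, base_thickness=1.0, absorption=1.0):
    """Fraction of light passed by a membrane stretched to `area_ratio` times its rest area."""
    thickness = base_thickness / area_ratio   # volume conservation
    return math.exp(-absorption * thickness)  # exponential attenuation

rest = transmittance(1.0)      # un-stretched balloon
inflated = transmittance(4.0)  # stretched to four times the rest area
print(rest < inflated)  # True: expansion opens the field to light
```

The same monotonic relationship holds for an opened weave: either way, expansion of the field maps to a gain in transparency.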

In order to produce a field that could offer such effects on cue, and in specific locations, a series of stitching patterns were investigated to find a networking pattern that would allow rigidity in organization and flexibility in location. The end result was a field of discs, composed in four layers with various stitching patterns allowing for different degrees of mobility. These layers were stitched together in all three dimensions, and connected to a series of controls. These controls allow for the implementation of specific forces that reconfigure the field. Ultimately these forces are indefinitely registered within the field, producing a field of adjustable density that, when subjected to a light source, offered a modifiable diffusion gradient.