Saturday, 13 June 2015

Week 10

Who is the moderator of style? Architects strive to uncover the next ism within their work: as Mies is to modernism, is ZHA to parametricism? Neil Leach aggressively deliberates Patrik Schumacher’s branding of parametricism as the new style. He first outlines the divide between parametric design (software-based incremental manipulation of design aspects, or parameters, which in turn manipulate the entirety of the assembly) and algorithmic design (the use of a scripting language, allowing the designer to invert the user interface and design through the direct manipulation of code, not form), insisting that one should not confuse the two and that a clear distinction must be maintained. He then reconciles the parametric and the algorithmic as intrinsically dependent: “we now find ourselves in a dialectical situation where code and form rely upon one another.” Leach finally frames Parametricism as a title, not a function, for the new order of style.
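Leach’s distinction can be sketched in a few lines of Python (a hypothetical toy, not anyone’s actual workflow): in the parametric mode, the designer turns a dial on one parameter and the whole assembly regenerates; in the algorithmic mode, the designer rewrites the generating rule itself.

```python
import math

def parametric_tower(radius: float, floors: int, twist_deg: float):
    """Parametric: one parameter (twist_deg) propagates through every floor."""
    floor_plans = []
    for i in range(floors):
        angle = math.radians(twist_deg) * i  # each floor rotated incrementally
        floor_plans.append((radius, angle))
    return floor_plans

# Changing a single parameter regenerates the entire assembly.
baseline = parametric_tower(radius=10.0, floors=5, twist_deg=3.0)
variant = parametric_tower(radius=10.0, floors=5, twist_deg=6.0)

# Algorithmic: the designer edits the rule itself, not the parameters --
# e.g. swapping the linear twist for a quadratic one changes the logic of form.
def algorithmic_tower(radius: float, floors: int):
    return [(radius, math.radians(0.5 * i * i)) for i in range(floors)]
```

The dialectic Leach describes is visible even here: the parametric dial only exists because code defines it, and the code is only legible through the forms it produces.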

How does a digital age, in which ‘the cloud’ dissolves the boundary between reality and cyberspace, establish architectural relevance when computational jargon usurps the established principles of architecture (form, structure, program, etc.)? John Frazer hypothesises that the digital realm enhances sensory and emotional response in the physical realm to the extent that the human condition becomes a hybrid cyborg, blood flowing through coaxial cables. Inevitably, Frazer concludes that, due to the linear processing nature of computerisation, the emphasis of architecture has shifted from product to process, highlighting an algorithmically based design process where code dictates geometry. However, this hybridised state must give way to the human within. Processing is made palatable through parametricism, as the rationale of beauty and style takes over from the cyber-centric object, devoid of reality, gravity or taste.


I come back to the idea that parametricism is the new style; I am torn between Schumacher and Leach. I do think that the new form of curvilinear, data-based design is the forefront of our digital age as we know it; however, if the form is no longer being designed, then the process has become the design. Are we creating a new style or a new process? Can this then be quantified as shape, or are the constructs of isms ideologically centric? All I can confidently conclude is that Microsoft Word is yet to catch up on the ‘new style’, as red squiggles suggest the inconclusiveness of parametricism as the new style, or maybe I just spell it wrong.

Week 9

Does the use of non-standard construction contribute to a wasteful society of consumption? Holden Pasquarelli puts forward the idea, or rather the attitude, of versioning. Versioning describes the shift away from traditional modes of thinking, expanding knowledge to explore the potential effects of design. It allows architects to appropriate practices and ideologies from a variety of disciplines (film, food, finance, etc.) to help solve a given problem. This attitude becomes a platform for extracting tactics seen in other streams (advertising, medicine, science, film, etc.) into the frame of architecture, or vice versa. It can be made palatable as a catalogue of standardised components that can be mixed and matched in response to a site-specific condition. So where does the role of the designer/architect fit in? In this world of computational design, individualised and highly varied works are becoming cheaper and more buildable, as technology provides the ability to quantify and manufacture components accurately and with minimal waste. Is this attitude of versioning just an appropriated commercialism?


The established practice of architecture fractured the architect from construction, resulting in inflexibility throughout the building process. The attitude of versioning aims to rectify that through standardisation; by contrast, digital fabrication tools such as CNC mills, laser cutters and 3D printers aim to engage all parties (architect, engineer, fabricator and client) in the fluid process of form finding. Additionally, these methods are highly tailorable and produce minimal wastage, making them cost efficient within the construction process. This approach is taken by Frédéric Migayrou, who views “the use of digital fabrication as liberation for the non-standard”.

Week 8

Surface and facade have become the ‘it’ words of 21st-century architectural discourse. Often viewed as a quick fix or a superficial ornament, surface is the latest gentrification of the architectural toolset, used to patch the realm between structure, form and the outside world. Antoine Picon subdivides digital architecture’s fascination with surface into two discourses: “surfaces bear more immediately the mark of formation processes than volumes… surfaces appear as more genuine expressions of parametric variations” and “surfaces challenge the traditional mode of presence of architecture as well as some of the fundamental binary structures that have characterised the discipline for a long time.” Picon highlights the tactility of surface as the point of contact between human and building, defining surface as a shift from tectonic-based form in an imaginative and variable way, appealing directly to the senses and leaving the subject unclear where his sensitive body ends and where exterior reality truly takes over, in a manner that would make Bruce Willis implode from sensory over-saturation.


By extension, Roland Snooks dives into the realm of collective intelligence, as data-driven systems gain traction as determinants within autonomous architectural and computational agents. Computation can now foster “form emerging from the interaction of localised entities within a complex system.” Snooks applies this notion of localised scale to surface treatment through the use of swarm matter and woven composites as structural precursors to a logical system, demonstrating a shift from uniformity to an emergent assemblage dictated by population interaction.
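A stripped-down agent sketch in Python (my own toy, not Snooks’ method) hints at what “form emerging from the interaction of localised entities” means in practice: each agent responds only to its near neighbours, yet the population drifts into a collective arrangement that no single rule describes.

```python
def swarm_step(positions, neighbour_radius=2.0, cohesion=0.1):
    """One update: each agent moves slightly toward the centroid of its
    local neighbours. No agent 'knows' the global form."""
    new_positions = []
    for x, y in positions:
        # find near neighbours (a purely local rule)
        near = [(px, py) for px, py in positions
                if abs(px - x) <= neighbour_radius and abs(py - y) <= neighbour_radius]
        cx = sum(p[0] for p in near) / len(near)
        cy = sum(p[1] for p in near) / len(near)
        new_positions.append((x + cohesion * (cx - x), y + cohesion * (cy - y)))
    return new_positions

agents = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (10.0, 10.0)]
for _ in range(50):
    agents = swarm_step(agents)
# the three nearby agents cluster together; the isolated one never moves
```

Real swarm systems layer many such local rules (cohesion, separation, alignment), but even this single rule shows the shift from designing a form to designing the interactions that produce one.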

Week 7

Materiality is a polarising aspect of architectural practice, viewed either as superficial or reverential. Antoine Picon analyses how the industrial era fractured the importance of materiality as the use of steel became widespread, resolving that only now are we “returning to a conception closer to the pre-industrial one, with all the researchers on composite and smart materials and the tendency to solve more and more problems at the level of material design rather than structural design.” This statement is mildly contradictory, as it implies that smart materials are not in the same realm as structure; I would argue that materiality follows the same architectural theology as structure, only at a micro or nano scale. This new manipulation of material is advanced through computation, as there is no finite scale within the digital realm; this gives the ability to view molecular material structures at the scale of the macro.

In conjunction with Picon, Michael Weinstock begins to analyse the importance of simple and complex polymers and the advances of materiality in the near future. He believes that the use of composite or ‘smart’ materials enhances the treatment of structure as an informing principle of design. Soap bubbles become his crutch to logically demonstrate circle-packing and Voronoi techniques, taking mathematical principles to derive tangents of best fit for non-tessellating or radially differing shapes. Weinstock states: “Form, structure and material act upon each other and this behaviour of all three cannot be predicted by analysis of any one of them separately.” He speaks of a hybrid, unknown composite that is the result of form, structure and material; as outlined in the earlier discussion of Picon, he believes in a more socialist process of design where each small micron of matter plays its part in an overall product. This is an embodiment of materiality as an extension of structure: a new materiality must be smart in its discourse, ‘playing its part’ in what will be the overall design, not acting as a trivial facade.
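The soap-bubble logic Weinstock invokes can be hinted at in a few lines of Python (a naive sketch, not his method): a Voronoi cell is simply the set of points closest to one seed, so assigning every grid point to its nearest seed tessellates the plane without any shape being predefined.

```python
def nearest_seed(point, seeds):
    """Index of the seed closest to a point -- the core rule of a Voronoi diagram."""
    return min(range(len(seeds)),
               key=lambda i: (seeds[i][0] - point[0]) ** 2 + (seeds[i][1] - point[1]) ** 2)

def voronoi_grid(seeds, width, height):
    """Label every cell of a width x height grid with its nearest seed:
    the boundaries between labels trace the Voronoi cell walls."""
    return [[nearest_seed((x, y), seeds) for x in range(width)]
            for y in range(height)]

seeds = [(1, 1), (8, 8)]
labels = voronoi_grid(seeds, 10, 10)
# labels[0][0] belongs to seed 0; labels[9][9] belongs to seed 1
```

Like the walls of packed bubbles, each cell boundary is a line of equidistance between two seeds; the ‘form’ is entirely a consequence of the seed positions and the distance rule.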

Friday, 12 June 2015

Week 5

Thomas Hughes attempts to curate a debate in response to technology as machine, through a critique of the intellectual positions of noteworthy industrial thinkers of the 19th and 20th centuries. What is clear within the reading is the divide between European (specifically German) and American approaches; this leads to a right-centric debate between fascism and capitalism, highlighting the absence of natural principles within industrialisation. Through Leon Trotsky, Hughes establishes the displacement of divinity within the realm of industrial engineering: “in the 19th century Americans had created a human-built world, believing that the creative act depended upon a God-given spark and that they were completing God’s creation. In the 20th century, God was no longer needed.” Fresh out of the war and deeply rooted in the boom of the second industrial revolution, cities were now seen as dictated by man and the ability to mass-produce standardised forms of technology. This led to standardised mechanical designs that were abundant and cheap to manufacture, resulting in a rigid puzzle of mechanisation.

By extension, Werner Sombart believed in a fascist implementation of a global machine as an alternative to capitalism. By means of cheap labour from developing countries and the management structures of Eurocentric corporations, he believed that the machine empowered humans to break free of their natural limitations.


By contrast, Charles Beard rediscovered the beauty of the natural and warned of ignorance towards nature through the furtherance of the machine. He notes that Americans only find solace in the quantitative, not the qualitative; by extension, they have become the machines, focused on numerical data as the determinant of success. This notion is surprisingly echoed by Oswald Spengler, who notes that the “cultural sea change occurred when humans began using technology to exploit nature”.

Sunday, 7 June 2015

Week 4

Technology theorists frequently quote systems theory, which defines emergence as a property of an entity that cannot be derived from the sum of its parts. Michael Hensel, Achim Menges & Michael Weinstock (HMW) derive the concept of emergence from systems theory, highlighting the gap, or lack thereof, between the natural and the mechanic. HMW introduce self-organising bodies, a form of morphogenesis, as key elements in the advancement of computational design; (blatantly) devoid of focus on form, they contextualise material performativeness as self-evolving, resulting in a form that has been ‘settled into’. “Performativeness is the quality of material systems that perform through deformation, or which visibly deform to self-organise and resist new external forces”. This is a holistic approach to architecture as a system in which each element is a necessary cog in the larger machine (or should I say automaton). I struggle to grasp HMW’s true furtherance of computational theory and practice: on one hand they boldly state that design should be self-generative, yet they then backpedal to the origins of architectural theory and suggest that a building is like a system, a sum of its parts. Describing computational design as a quantifiable method of system performance again subsides, to this reader, into the realm of BIM: a great tool, but not a computational approach, just a computerised one.
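HMW’s idea of a form that has been ‘settled into’ is essentially form-finding, and a crude Python relaxation (my own illustration, not theirs) makes it concrete: nothing here draws a curve; the curve appears when the internal and external forces balance.

```python
def relax_chain(n_nodes=7, iterations=500, gravity=-0.01, stiffness=0.5):
    """Crude relaxation: interior nodes settle under a constant downward
    load while being pulled toward the average of their neighbours.
    The sag is never drawn; it emerges when the forces balance."""
    ys = [0.0] * n_nodes  # all nodes start on a flat line
    for _ in range(iterations):
        new_ys = ys[:]
        for i in range(1, n_nodes - 1):  # the two end nodes stay fixed
            neighbour_avg = (ys[i - 1] + ys[i + 1]) / 2
            new_ys[i] = ys[i] + stiffness * (neighbour_avg - ys[i]) + gravity
        ys = new_ys
    return ys

shape = relax_chain()
# the settled chain sags symmetrically, deepest at the middle node
```

The contrast with BIM is exactly the point I am circling: BIM records and coordinates a form that was already decided; a routine like this lets the form be the residue of a process.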


Inclusively, Christopher Hight & Chris Perry (HP) annotate the notion of Picon’s already-existing cyborg within the realm of the digital age, advanced through liberating technological communication platforms. “Machines and technologies that are an extension of that social body, one cannot differentiate practice from product, or the notion of the human or social from the technological or the natural.” The creation of Web 2.0 cast the foundations for information exchange, liberating intelligence from the centralised being to the collective cloud, again misguidedly building upon the advances of BIM, convolutedly masquerading as a computational design tool. There is no doubt that collective intelligence is a large driver in the cabin of the architectural profession (any profession, for that matter); however, it is that oversized, hairy and sweaty driver that is commonly mistaken for the contents of the haul. The real fusion of the natural and the mechanic has not been found, discussed or even pondered to the extent achieved by academics such as Picon or Menges, who have truly identified the difference between computation and computerisation: one is a deduction of values, the other a compilation of values.

Week 3

Discourse, as an extension of the human condition, is an ever-expanding aspiration. Like methamphetamine to a junkie, humanity is in a constant state of rationalisation, determined to establish coherence within a multiplicity of material; we always need our fix. Michel Foucault offers up several themes as pathways to discursive enlightenment, principally aroused by “the interplay of the rules that make possible the appearance of objects during a given period of time”. Foucault draws a parallel between the use of displacement and madness as catalysts to discourse. Once subjects can effectively segment, the intrigue is how one can categorise and individualise the coexistence of these objects, highlighting the system of division, parallel dependency, morphogenesis and the interaction of location, arrangement and replacement of the established heterogeneous.

By comparison, Sean Ahlquist and Achim Menges conclude that parametricism, as a result of mathematical discursion, is the computational embodiment of morphogenesis, given expression through: “Form… is a system which organises itself in the presence of both internal and external forces, and that the organisations can shape patterns traced through mathematical rules.” Ahlquist and Menges clearly mirror Foucault’s notion that the interplay of rules defines the appearance of objects. Both texts also define the importance of context in the processes of classification; however, Ahlquist and Menges approach the topic from a less anarchic stance, evaluating where the role of the designer or orchestrator would become apparent in a world of self-defining, morphic computation. Stating that the designer is now the author of the rule-set as descriptions towards the development of form, they position the architect in a more philosophical role, endowed with the power of selection and argument generation in “both a technical computational manner and a theoretical material vein.”


Academics revel in the anarchic upheaval of established discourse, drawing parallels between factions; however, Foucault beautifully grounds the temptations of theoretical inflation, poetically noting that “the rules of formation are conditions of existence in a given discursive division.” Like the built environment and other established social structures, we must be wary of geo-divisional morphogenesis and note the similarities, differences and interactions within the micro-environment holding the information that we seek.