"Physicist Jim Al-Khalili travels through Syria, Iran, Tunisia and Spain to tell the story of the great leap in scientific knowledge that took place in the Islamic world between the 8th and 14th centuries. Its legacy is tangible, with terms like algebra, algorithm and alkali all being Arabic in origin and at the very heart of modern science – there would be no modern mathematics or physics without algebra, no computers without algorithms and no chemistry without alkalis.
He discovers how medieval Islamic scholars helped turn the magical and occult practice of alchemy into modern chemistry and argues that these scholars were among the first people to insist that all scientific theories be backed up by careful experimental observation, bringing a rigour to science that didn’t really exist before."
Call for Participation – Digital Methods Summer School 2014, On Geolocation: Remote Event Analysis (Mapping Conflicts, Disasters, Elections and other Events with Online and Social Media Data), 23 June – 4 July 2014
"The Digital Methods Initiative is a contribution to doing research into the 'natively digital'. Consider, for example, the hyperlink, the thread and the tag. Each may 'remediate' older media forms (reference, telephone chain, book index), and genealogical histories remain useful (Bolter/Grusin, 1999; Elsaesser, 2005; Kittler, 1995). At the same time new media environments – and the software–makers – have implemented these concepts, algorithmically, in ways that may resist familiar thinking as well as methods (Manovich, 2005; Fuller, 2007). In other words, the effort is not simply to import well–known methods – be they from humanities, social science or computing. Rather, the focus is on how methods may change, however slightly or wholesale, owing to the technical specificities of new media.
The initiative is twofold. First, we wish to interrogate what scholars have called 'virtual methods,' ascertaining the extent to which the new methods can stake claim to taking into account the differences that new media make (Hine, 2005). Second, we desire to create a platform to display the tools and methods to perform research that, also, can take advantage of 'web epistemology'. The web may have distinctive ways of recommending information (Rogers, 2004; Sunstein, 2006). Which digital methods innovate with and also critically display the recommender culture that is at the heart of new media information environments?
Amsterdam-based new media scholars have been developing methods, techniques and tools since 1999, starting with the Net Locator and, later, the Issue Crawler, which focuses on hyperlink analysis (Govcom.org, 1999, 2001). Since then a set of allied tools and independent modules have been made to extend the research into the blogosphere, online newssphere, discussion lists and forums, folksonomies as well as search engine behavior. These tools include scripts to scrape web, blog, news, image and social bookmarking search engines, as well as simple analytical machines that output data sets as well as graphical visualizations.
The analyses may lead to device critiques – exercises in deconstructing the political and epistemological consequences of algorithms. They may lead to critical inquiries into debates about the value and reputation of information."
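To make the hyperlink analysis mentioned above a little more concrete, here is a minimal sketch of the kind of operation such tools build on: fetch a page, extract its outbound links, and tally the external hosts it points to. This is an illustrative Python example using only the standard library, not the Issue Crawler's actual implementation; the seed URL is a placeholder.

```python
# Sketch of basic hyperlink extraction for outlink analysis (illustrative only).
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect the href value of every <a> tag encountered in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def outbound_hosts(seed_url):
    """Return a count of external hosts linked from seed_url."""
    html = urlopen(seed_url).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    seed_host = urlparse(seed_url).netloc
    hosts = Counter()
    for href in parser.links:
        host = urlparse(urljoin(seed_url, href)).netloc
        if host and host != seed_host:
            hosts[host] += 1
    return hosts


if __name__ == "__main__":
    # Placeholder seed page; a crawler repeats this over many seed pages
    # and then looks for sites that several seeds link to in common.
    for host, count in outbound_hosts("https://example.org/").most_common(10):
        print(host, count)
```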
"I am going to argue that 'media independence' does not just happen by itself. For a technique to work with various data types, programmers have to implement a different method for each data type. Thus, media–independent techniques are general concepts translated into algorithms, which can operate on particular data types. Let us look at some examples.
Consider the omnipresent cut and paste. The algorithm to select a word in a text document is different from the algorithm to select a curve in a vector drawing, or the algorithm to select a part of a continuous tone (i.e. raster) image. In other words, 'cut and paste' is a general concept that is implemented differently in different media software depending on which data type this software is designed to handle. (In Larry Tesler's original implementation of the universal commands concept done at PARC in 1974–5, it only worked for text editing.) Although cut, copy, paste, and a number of similar 'universal commands' are available in all contemporary GUI applications for desktop computers (but not necessarily in mobile phone apps), what they actually do and how they do it is different from application to application.
Search operates in the same way. The algorithm to search for a particular phrase in a text document is different from the algorithm that searches for a particular face in a photo or a video clip. (I am talking here about 'content-based search,' i.e. the type of search which looks for information inside actual images, as opposed to only searching image titles and other metadata the way image search engines such as Google Image Search were doing it in the 2000s.) However, despite these differences the general concept of search is the same: locating any elements of a single media object – or any media objects in a larger set – that match particular user-defined criteria. Thus we can ask the web browser to locate all instances of a particular word in a current web page; we can ask a web search engine to locate all web pages which contain a set of keywords; and we can ask a content-based image search engine to find all images that are similar in composition to an image we provided. ...
Against these historical developments, the innovations of media software clearly stand out. They bring a new set of techniques which are implemented to work across all media. Searchability, findability, linkability, multimedia messaging and sharing, editing, view control, zoom and other 'media-independent' techniques are viruses that infect everything software touches – and therefore in their importance they can be compared to the basic organizing principles for media and artifacts which were used for thousands of years."
(Lev Manovich, 2013, pp.113–124)
Manovich, L. (2013). Software Takes Command. Continuum.
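One way to make Manovich's point concrete in code: 'search', like cut and paste, is a single concept, but each data type needs its own algorithm behind it. The two classes below are illustrative assumptions rather than any real application's API: a text document searched by substring, and a tiny grayscale 'image' searched by matching a pixel patch.

```python
# Illustrative sketch: one media-independent operation ("search"),
# two data-type-specific algorithms behind it.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TextDocument:
    text: str

    def search(self, query: str) -> List[int]:
        """Return the character offset of every occurrence of query."""
        hits, start = [], 0
        while (i := self.text.find(query, start)) != -1:
            hits.append(i)
            start = i + 1
        return hits


@dataclass
class GrayscaleImage:
    pixels: List[List[int]]  # rows of 0-255 intensity values

    def search(self, patch: List[List[int]]) -> List[Tuple[int, int]]:
        """Return the (row, col) positions where patch matches exactly."""
        ph, pw = len(patch), len(patch[0])
        hits = []
        for r in range(len(self.pixels) - ph + 1):
            for c in range(len(self.pixels[0]) - pw + 1):
                if all(self.pixels[r + i][c + j] == patch[i][j]
                       for i in range(ph) for j in range(pw)):
                    hits.append((r, c))
        return hits


# The caller sees the same general concept...
doc = TextDocument("to be or not to be")
img = GrayscaleImage([[0, 0, 9], [0, 9, 9], [9, 9, 9]])
print(doc.search("be"))              # [3, 16]
print(img.search([[9, 9], [9, 9]]))  # [(1, 1)]
# ...while each data type supplies its own algorithm underneath.
```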
"So, working with Bentley and MicroStation, we've written a custom piece of software that networks all of the components together into these chunks of information, so that if we change any element along the length of the building, not only does that change distribute through each one of the trusses, but each one of the trusses then distributes that information down the length of the entire facade of the building. So it's a single calculation for every single component of the building that we're adding onto. So, it's tens of millions of calculations just to design one connection between a piece of structural steel and another piece of structural steel. But what it gives us is a harmonic and synthesized relationship of all these components, one to another.
This idea has, kind of, brought me into doing some product design, and it's because design firms that have connections to architects, like, I'm working with Vitra, which is a furniture company, and Alessi, which is a houseware company. They saw this actually solving a problem: this ability to differentiate components but keep them synthetic. So, not to pick on BMW, or to celebrate them, but take BMW as an example. They have to, in 2005, have a distinct identity for all their models of cars. So, the 300 series, or whatever their newest car is, the 100 series that's coming out, has to look like the 700 series, at the other end of their product line, so they need a distinct, coherent identity, which is BMW. At the same time, there's a person paying 30,000 dollars for a 300-series car, and a person paying 70,000 dollars for a 700 series, and that person paying more than double doesn't want their car to look too much like the bottom-of-the-market car. So they have to also discriminate between these products. So, as manufacturing starts to allow more design options, this problem gets exacerbated, of the whole and the parts."
(Greg Lynn, February 2005)
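The associative behaviour Lynn describes, where changing one driving value re-derives every truss along the facade, can be sketched in a few lines. This is not the custom Bentley/MicroStation software he mentions, just a toy parametric model in Python; the parameter names and the sine-based depth rule are invented for illustration.

```python
# Toy parametric facade: every truss is derived from shared driving
# parameters, so one edit propagates to all components in a single pass.
import math
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class FacadeModel:
    length: float       # overall length of the facade (illustrative units)
    truss_count: int    # number of trusses along that length
    base_depth: float   # truss depth at the two ends
    bulge: float        # extra depth at mid-span (invented driving parameter)

    def truss_layout(self) -> List[Tuple[float, float]]:
        """Re-derive every truss (position, depth) from the current parameters."""
        spacing = self.length / (self.truss_count - 1)
        layout = []
        for i in range(self.truss_count):
            t = i / (self.truss_count - 1)  # 0.0 .. 1.0 along the facade
            depth = self.base_depth + self.bulge * math.sin(math.pi * t)
            layout.append((round(i * spacing, 2), round(depth, 2)))
        return layout


model = FacadeModel(length=120.0, truss_count=9, base_depth=2.0, bulge=1.5)
print(model.truss_layout())

# Change one driving value and every dependent component updates together,
# which is how the parts stay differentiated but still related.
model.bulge = 0.5
print(model.truss_layout())
```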
This is a useful initiative, despite its narrow focus on engineering and science. It would be great to see a companion film that profiles creative arts and design professionals who regularly use programming as part of their practice.