"The Virtual Reef is a life-sized marine ecosystem expanding across two levels of the new Science and Engineering Centre. Multi-touch technologies enable the user to manipulate, intimately explore and interact with the reef world, its specific behaviours and relationships.
Australia's leading marine science and interactive and visual design organisations, QUT and the Queensland Museum, bring knowledge and research of the underwater world to your fingertips through multi-touch screens and projectors.
Users will have the opportunity to go beyond the cinematic experience and interact with the marine world. Each interaction has associated content designed to complement the aims of the National Curriculum and provide an exploratory learning experience."
(Jeff Jones, the Cube, QUT)
Fig. 1 "The Virtual Reef" project team: Professor Jeff Jones (Cube Project Leader), Associate Professor Michael Docherty (Project Leader), Warwick Mellow (Principal Animator/Art Director), Joti Carroll, Paul Gaze, Sean Gobey, Ben Alldridge, Sophia Carroll, Sherwin Huang, Bryce Christensen.
"As it happens, designing Future Interfaces For The Future used to be my line of work. I had the opportunity to design with real working prototypes, not green screens and After Effects, so there certainly are some interactions in the video which I'm a little skeptical of, given that I've actually tried them and the animators presumably haven't. But that's not my problem with the video.
My problem is the opposite, really – this vision, from an interaction perspective, is not visionary. It's a timid increment from the status quo, and the status quo, from an interaction perspective, is actually rather terrible. ...
I'm going to talk about that neglected third factor, human capabilities. What people can do. Because if a tool isn't designed to be used by a person, it can't be a very good tool, right? ...
Do you see what everyone is interacting with? The central component of this Interactive Future? It's there in every photo! That's right! – HANDS. And that's great! I think hands are fantastic! Hands do two things. They are two utterly amazing things, and you rely on them every moment of the day, and most Future Interaction Concepts completely ignore both of them. Hands feel things, and hands manipulate things.
Go ahead and pick up a book. Open it up to some page. Notice how you know where you are in the book by the distribution of weight in each hand, and the thickness of the page stacks between your fingers. Turn a page, and notice how you would know if you grabbed two pages together, by how they would slip apart when you rub them against each other.
Go ahead and pick up a glass of water. Take a sip. Notice how you know how much water is left, by how the weight shifts in response to you tipping it.
Almost every object in the world offers this sort of feedback. It's so taken for granted that we're usually not even aware of it. Take a moment to pick up the objects around you. Use them as you normally would, and sense their tactile response – their texture, pliability, temperature; their distribution of weight; their edges, curves, and ridges; how they respond in your hand as you use them.
There's a reason that our fingertips have some of the densest areas of nerve endings on the body. This is how we experience the world close-up. This is how our tools talk to us. The sense of touch is essential to everything that humans have called 'work' for millions of years.
Now, take out your favorite Magical And Revolutionary Technology Device. Use it for a bit. What did you feel? Did it feel glassy? Did it have no connection whatsoever with the task you were performing?
I call this technology Pictures Under Glass. Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade."
(Bret Victor, 8 November 2011)
"The IRC brought together researchers from eight different institutions and a variety of disciplines which address the technical, social and design issues in the development of new inter-relationships between the physical and digital.
A series of experience projects engaged different user communities to develop new combinations of physical and digital worlds, and to explore how these may be exploited to enhance the quality of everyday life.
A series of research challenges explored (a) new classes of device which link the physical and the digital, (b) adaptive software architectures and (c) new design and evaluation methods, which draw together approaches from social science, cognitive science and art and design. Equator involved over 60 researchers, with a range of expertise encompassing computer science, psychology, sociology, design and the arts.
Equator aimed to forge a clearer understanding of what it means to live in an age when digital and physical activities not only coexist but cooperate. This is the age we are now entering, and it promises radical change in how we communicate, interact, work and play: that is, how we live. But to fulfil that promise requires more than new technology. We need equally new ways of thinking about technology, and thus also about ourselves.
Everyone recognises that the computer is moving beyond the workplace. As digital systems (like the Web) converge with computer networks and cellular phone communications, new devices and services proliferate, many of them mobile or embedded in the environment. Yet few people fully grasp the potential impact of such technological fluidity and ubiquity. Most current research is still rooted in the workaday world of the desk-bound PC. But look at the possibilities: for our home life, our schooling, community care, even our city streets.
These are just some of the areas which Equator explored, through the development of coherent new systems and devices. Ultimately, however, we were less concerned with solutions to specific design problems than with the bigger picture these solutions entail. This is what united so diverse a community of researchers. For it is only by sketching the bigger picture that we can begin to fulfil the promise offered by our new age, and so improve the quality of everyday life in years to come."
London 22–25 September 2011: "Alpha-ville festival explores the intersection between art, technology and society, and for this edition we are collaborating with various venues and spaces in London such as the Victoria & Albert Museum, Whitechapel Gallery, Rich Mix Cultural Foundation, Space Studios, Vortex Jazz Club, Netil House, XOYO and Hearn Street Warehouse to bring an extensive 4-day event featuring social media art, kinect art, interactive installations, open labs, workshops, performances, screenings, live music & A/V shows, a one-day symposium and more!
The 2011 edition provides an online and live platform to explore, test and disseminate new ideas, emerging trends, collaborations and groundbreaking works. Running from 22–25 September and taking place alongside the London Design Festival, the 2011 edition enables a network of satellite events spreading across different London boroughs and links with other European cities such as Madrid (Twin Gallery) and Brussels & The Hague (Todays Art).
The festival programme also connects east and west London through a link with the V&A Digital Design Weekend."
"openFrameworks is a C++ library designed to assist the creative process by providing a simple and intuitive framework for experimentation.
The library is designed to work as general-purpose glue, and wraps together several commonly used libraries under a tidy interface: OpenGL for graphics, rtAudio for audio input and output, FreeType for fonts, FreeImage for image input and output, and QuickTime for video playback and sequence grabbing.
The code is written to be both cross-platform (PC, Mac, Linux, iPhone) and cross-compiler. The API is designed to be minimal and easy to grasp. There are very few classes, and inside those classes there are very few functions. The code has been implemented so that there is minimal cross-referencing within the classes, making it quite easy to rip out and reuse, if you need, or to extend.
Simply put, openFrameworks is a tool that makes it much easier to make things via code. We find it super useful, and we hope you do too.
openFrameworks is actively developed by Zach Lieberman, Theodore Watson, and Arturo Castro, with help from the OF community. ofxiPhone is actively developed by Mehmet Akten and Zach Gage, with development help from Lee Byron and Damian Stewart. The OF website is designed and maintained by Chris O'Shea.
OpenFrameworks is indebted to two significant precursors: the Processing development environment, created by Casey Reas, Ben Fry and the Processing community; and the ACU Toolkit, a privately distributed C++ library developed by Ben Fry and others in the MIT Media Lab's Aesthetics and Computation Group."
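The "very few classes, very few functions" design described above can be seen in the canonical openFrameworks application skeleton: a single class with setup/update/draw hooks handed to the framework's run loop. The sketch below is illustrative only; it assumes a generated openFrameworks project (the `ofMain.h` header and build files come with the framework, so it will not compile standalone), and the drawing-call names follow the current API (older releases of the era used e.g. `ofCircle` rather than `ofDrawCircle`).

```cpp
#include "ofMain.h"

// A minimal openFrameworks app: one class, three overridden hooks.
class ofApp : public ofBaseApp {
public:
    void setup() override {
        ofSetCircleResolution(64);  // smoother circle outline
        ofBackground(30);           // dark grey background
    }
    void update() override {
        // per-frame state changes would go here
    }
    void draw() override {
        // a filled circle that follows the mouse, redrawn every frame
        ofSetColor(255, 120, 0);
        ofDrawCircle(ofGetMouseX(), ofGetMouseY(), 40);
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);  // create the GL window
    ofRunApp(new ofApp());                // hand control to the framework
    return 0;
}
```

Everything else (event dispatch, the window, the frame loop) lives inside the framework, which is what makes the pieces easy to "rip out and reuse" as the authors describe.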