CHECK THE UPDATED PAGE AND NEWS ABOUT THIS PROJECT HERE: visit and have a s:o:u:p!

Jetzt Schemata:

Go into the city (walking, by car, skating, ...) and track down its wireless networks, "private" and "public", "open" or "closed". Map them out from the traditional 2D representation of street crossings and building patterns. Map them into real-time sound and 3D objects, generated and animated by the data streamed from GPS, wireless and other location-aware devices.

There is always a sense of spatio-temporal immersion in the city: the physical surroundings, sounds, smells, colors and people around us simply embrace us. One can approach them selectively, as when taking a closer look at a detail, a tone or a tact. The marks that make each place/space we dwell in somehow unique are what we could define as singularities; they will be the source material we search for in an attempt at sound and 3D representation. We all know the city is built upon layers of data, meta-constructs, that are named, "owned", and not always readable or recognizable by humans. We think of them as the source for a schemata that gives shapes and tones to geoposition signals and surrounding airwaves. The city just needs a script, an instrument, to interpret itself. That is what we are working on.

Mapping has always been a task for the military, the urban planner or the cartographer. Today we see the rise of many new ways of practicing what we call mapping; we are interested in mapping the digital territories we cross and inhabit.

We are now subsumed in a compacted reality that can be unfolded by new ways of representation.

How to map life in the mixed realities we live in, where a blend of public and virtual, remote and private, is always colliding with one another through the gracious and the fortuitous?

Our project "Jetzt Schemata" is based on code we are developing that uses the Open Sound Control (OSC) communication protocol to interface wireless, sound and visual (OpenGL) applications such as Kismet, GPSDrive, Pure Data, Fluxus, Max/MSP, and any other that can open a port for receiving or sending data via Open Sound Control, aka OSC.


LINK TO A DOWNLOADABLE VERSION OF KISMET2OSC:
check in S.O.U.P

DESCRIPTION
k2o is a bridge between the wireless network detector Kismet and the Open Sound
Control protocol. Currently it consists of two programs: k2orec acts as a
Kismet client; it connects to a Kismet server via TCP and logs the output to a
text file. The file consists of a timestamp and an OSC message per line. k2oplay
reads this file and sends the messages to the specified OSC targets. k2o is
written in C/C++ and depends only on liblo, the lightweight OSC implementation.
k2o runs on Mac OS X (see BUGS in the README file) and Linux.

IT WILL RECORD GPS AND WIRELESS DATA TO A .TXT FILE.
IT WILL ALSO SEND THIS DATA IN REAL TIME TO ANY OSC PORT.
THE SAVED .TXT FILE CAN BE REPLAYED WITH k2oplay SO IT CAN BE
USED BY SEVERAL DIFFERENT APPLICATIONS THAT SUPPORT OSC.

Send comments and suggestions to:
lorenz schori loATznerol.ch or alejoduqueATgmail.com

 

project by:
lorenz schori (Switzerland)
I'm continuously developing ways to join my two main interests, music and computer programming, exploring new possibilities off the beaten path. I'm looking forward to my studies at the Jazz School Lucerne, where I will refine my skills in both of those worlds.

alejo duque (Colombia)
I'm currently doing research on the topic of smuggling ideas for a PhD dissertation at www.egs.edu. It is all based on the "hacks" that the resourceful mind can make of all kinds of new and old technical/conceptual devices (very often the "good" examples are found in criminal records and drug-trafficking techniques). With the computer I pay close attention to the concept of networking, and together with Lorenz I am trying to develop a way to interface place, location and trajectory with audiovisual representations... so-called "real-time mapping".