Tuesday 20 December 2011

Start Animating..

I started working on my interface animation even though the programming part of my project is still floating somewhere.*=_____=!!!* I masked out all the graphics one by one using Adobe Flash. At the same time, I am continuing my research on the scripting and hope to get it working before the new year, so that I can proceed to the next stage.

Friday 16 December 2011

Interface Refine...

After the tutorial session, my lecturer suggested that I explore different styles and colours for my interface. So, after several rounds of testing *@___@...eyes blurring from trying out so many different colours and styles* Finally....Here it goes..My final interface:


Before
Final Interface


Monday 12 December 2011

Research again & again...

Throughout my research, I found a project with a similar execution to mine: "The Bird Table" by Norman Lau. The information is very useful to me, as it clearly lays out the construction of the mechanism, the programming, and the execution method. However, in this reference he uses reacTIVision as the cross-platform framework.


reacTIVision
It is an open source, cross-platform computer vision framework for the fast and robust tracking of fiducial markers attached onto physical objects, as well as for multi-touch finger tracking. It was mainly designed as a toolkit for the rapid development of table-based tangible user interfaces (TUI) and multi-touch interactive surfaces.


"The Bird Table"
http://www.normanlaudesign.com/projects/software/birdtable_book.pdf
The idea of the Bird Table is a simple, playful app that lets multiple people interact with a visualization of birds. Utilizing the object recognition capabilities of a multi-touch table, users can manipulate properties like color, scale, and number of birds.

Tuesday 6 December 2011

Meeting Jazmi...

I met Jazmi this afternoon for a better understanding of the interactive tangible interface. Eva and Evie joined the testing session as well. We helped Jazmi set up the mechanism using recycled material left over by students after their project presentations (e.g. a wooden cabinet). At first, we encountered a problem: the projector could not project the screen perfectly onto the glass because of the limited space inside the wooden cabinet. We then tried changing the position of the projector and finally solved the problem. After that, we set up the PS3 Eye and found that it could only capture images over a small surface, so Jazmi connected two cameras together to capture the whole screen. It seemed successful, but it is quite hard to merge the images captured by the two cameras perfectly. Still, we continued testing the fiducial tracking, with CCV as the tracking software. The tracking was unstable; Jazmi told us it might be caused by the lighting and also by the camera itself.

Test Out video clip-Set Up
Test Out video clip- fiducial tracking

Thursday 1 December 2011

Scripting!!!

Scripting is one of the trickiest and most difficult parts for me because I'm not really good at programming. However, after some research, I found that the following software is needed to create this project:


Adobe Flash (scripting)


Community Core Vision (CCV)
http://ccv.nuigroup.com/
An open source, cross-platform solution for computer vision and machine sensing. It takes a video input stream and outputs tracking data and events that are used in building multi-touch applications. It is used for finger, fiducial and object tracking in interactive works like multi-touch and tangible interfaces. CCV can interface with various web cameras and video devices, as well as connect to TUIO/OSC/XML.


TUIO
http://www.tuio.org/
It is an open framework that defines a common protocol and API for tangible multitouch surfaces. The TUIO protocol allows the transmission of an abstract description of interactive surfaces, including touch events and tangible object states. This protocol encodes control data from a tracker application (e.g. based on computer vision) and sends it to any client application that is capable of decoding the protocol. 
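To get a feel for what a tracker like CCV actually transmits, here is a small Python sketch (Python used here only for illustration, since my project scripting is in Flash) that hand-encodes one OSC message of the kind TUIO uses: a `/tuio/2Dobj` "set" message carrying a fiducial's session ID, marker ID, position, angle, velocities and accelerations. The specific argument values are made up for illustration.

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, args) -> bytes:
    """Encode one OSC message: address, type-tag string, big-endian arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        else:
            tags += "s"
            payload += osc_string(a)
    return osc_string(address) + osc_string(tags) + payload

# A TUIO "set" message for one fiducial on a 2D surface:
# session id, marker id, x, y, angle, velocities, accelerations.
msg = osc_message("/tuio/2Dobj",
                  ["set", 12, 4, 0.25, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
print(len(msg))  # the whole packet stays a multiple of 4 bytes
```

The client side (Flash, in my case) just does this in reverse: read the address, read the type tags, then decode each argument.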


Flosc
http://benchun.net/flosc/
A standalone application written in Java that sends and receives OSC packets via UDP, translates bidirectionally between binary OSC packets and an XML encoding of OSC packets, and sends and receives XML entities via TCP in a way that’s compatible with Flash’s XMLSocket feature.
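The Flash side of this bridge hinges on one detail: XMLSocket delimits every XML document with a single zero byte. A minimal Python sketch of that framing is below; the `OSCPACKET`/`MESSAGE`/`ARGUMENT` element names mimic the style of Flosc's XML encoding but should be treated as an assumption rather than its exact schema, and `frame`/`split_frames` are hypothetical helper names.

```python
import xml.etree.ElementTree as ET

def frame(xml_text: str) -> bytes:
    """Flash's XMLSocket delimits each XML document with a zero byte."""
    return xml_text.encode("utf-8") + b"\x00"

def split_frames(buf: bytes):
    """Split a received byte stream back into complete XML documents."""
    *docs, rest = buf.split(b"\x00")
    return [d.decode("utf-8") for d in docs if d], rest

# Build an OSC-message-as-XML packet (element names are an assumption).
pkt = ET.Element("OSCPACKET", ADDRESS="127.0.0.1", PORT="3333", TIME="0")
m = ET.SubElement(pkt, "MESSAGE", NAME="/tuio/2Dobj")
ET.SubElement(m, "ARGUMENT", TYPE="s", VALUE="alive")
data = frame(ET.tostring(pkt, encoding="unicode"))

docs, leftover = split_frames(data)
print(len(docs), leftover)  # one complete document, nothing left over
```

In the real setup, Flosc does the OSC-to-XML translation and the Flash movie only parses the XML it receives over the TCP socket.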


After identifying the possible software for the project, it is time to look for Flash ActionScript 3 sources. OMG~~I got some resources from NUI Group, but it seems like I do not really understand the whole structure of the script. I think I should meet up with Yi Wei or Jazmi for a better understanding of the scripting.(* Luckily we touched on CCV the semester before, so I think I still remember how it works^^)