LiveChromaKey + LivePointers = AR Presentation!

I am a newcomer to the wonderful world of ActionScript, and I have worked hard over the past two weeks to write a couple of pure ActionScript 3.0 libraries, LiveChromaKey and LivePointers. I then gave a talk at the Spark project's SparkStudy/09 on May 28th.

SparkStudy is a monthly meeting for cutting-edge ActionScript developers in Tokyo, hosted by Yoshihiro Shindo, a.k.a. yossy. This was the first time for me, an ECMAScripter, to attend, and I enjoyed it.

As recently posted, LiveChromaKey is a bluescreen-less image synthesizing engine for AR.

And the new library, LivePointers, is a color detection engine that turns colored objects seen by a webcam into a new style of human interface device. For example, a fingercap can act as a 3D pointing device.
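As a rough illustration of how such color detection can work (a sketch of the general technique in plain JavaScript, not LivePointers' actual ActionScript code): scan each frame for pixels close to a target color and use their centroid as the pointer position.

```javascript
// Illustrative sketch only; LivePointers' real API is not shown here.
// pixels: flat array of [r, g, b, r, g, b, ...] values for one frame.
function findColorCentroid(pixels, width, target, tolerance) {
  var sumX = 0, sumY = 0, count = 0;
  for (var i = 0; i < pixels.length; i += 3) {
    var dr = pixels[i]     - target[0];
    var dg = pixels[i + 1] - target[1];
    var db = pixels[i + 2] - target[2];
    // Pixel matches when its color is within `tolerance` of the target.
    if (dr * dr + dg * dg + db * db <= tolerance * tolerance) {
      var p = i / 3;
      sumX += p % width;
      sumY += Math.floor(p / width);
      count++;
    }
  }
  if (count === 0) return null;          // target color not found
  return { x: sumX / count, y: sumY / count, area: count };
}
```

Tracking x/y from the centroid while using the blob's area as a rough depth cue is one way a single colored cap could drive a crude 3D pointer.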

Anyway, you can try my presentation:

Yes, I appeared on the live projector screen during my presentation, like a newscaster on a weather news program! It was nice for the audience seated at the back of the room to be able to see me on the screen. :)

In addition, I used fingercaps to manipulate the slides; for example, raising my right hand meant "next page." The fingercaps I used cost only JPY 105, approximately USD 1.00, for six fingers. I'm sure these cheap caps will become an important user interface device of the future!

Note that you can also flip to the next page with the right arrow key of your keyboard, as insurance during a live presentation. :)

If you prefer a classic slide format, try SlideShare:

The LivePointers library is still under development. You can try the current snapshot in the Spark project's repository:
* The original post was written in Japanese.

Tokyo Cloud Developers Meetup #02 feat. Google App Engine

We'll welcome Fred Sauer, Google Developer Advocate, as the special guest of the second Tokyo Cloud Developers Meetup. Come over on June 10 to enjoy the latest topics around Google App Engine.

Register now!




LiveChromaKey - Bluescreen-less augmented IN reality (AR)

I wrote a new ActionScript library, LiveChromaKey, which is an image-synthesizing engine for AR. The key point is that it produces not merely something augmented ON reality, but something augmented IN reality. It is truly portable, as it never needs a blue background screen: the key color for the chroma key is detected automatically on the fly.

Demo #1

Try: Travelling In Egypt

A webcam is needed to try this; a blue background screen is not. First, step out of the camera's view so that LiveChromaKey can recognize the background. After a few seconds, the pyramid of Khafre will appear. Time to play! Now you can feel as if you are travelling in Egypt. :)

If you move the camera, click the screen to re-recognize the background on demand, then step out of view and wait a few seconds again. You may need to turn off your camera's intelligent features, such as automatic white balance and automatic exposure compensation; LiveChromaKey does not like such adjustments.
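The background-learning step can be sketched in plain JavaScript (purely illustrative; LiveChromaKey itself is ActionScript 3.0 and its internals are not shown here): average a few frames while the scene is empty, then flag any pixel that differs enough from that average as foreground. A global shift from auto white balance moves many pixels at once, which is exactly why such camera features get in the way.

```javascript
// Illustrative sketch, not LiveChromaKey's actual implementation.
// Frames are flat grayscale arrays for simplicity.

// Average N captured frames into a stationary background model.
function learnBackground(frames) {
  var bg = new Array(frames[0].length).fill(0);
  for (var f = 0; f < frames.length; f++) {
    for (var i = 0; i < bg.length; i++) bg[i] += frames[f][i];
  }
  for (var j = 0; j < bg.length; j++) bg[j] /= frames.length;
  return bg;
}

// A pixel is foreground when it differs from the background model
// by more than a threshold.
function foregroundMask(frame, bg, threshold) {
  return frame.map(function (v, i) {
    return Math.abs(v - bg[i]) > threshold ? 1 : 0;
  });
}
```

Presumably the library's reduced working resolution (workX/workY, described below) exists so that this kind of per-pixel comparison stays cheap at runtime.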

Demo #2

Try: Minority Report-like Demo

This demo floats some photos behind you. Finger-pointing recognition like in the movie Minority Report is not implemented at this time.

Demo #3

Try: Four Sprites Of LiveChromaKey

This demo shows the four sprites that LiveChromaKey provides.

LiveChromaKey Sample Code

var chromakey:LCK_Core = new LCK_Core();

var spLive:Sprite = chromakey.getLive();
var spBack:Sprite = chromakey.getBackground();
var spMask:Sprite = chromakey.getMask();
var spFore:Sprite = chromakey.getForeground();

this.addChild( spLive );
this.addChild( spBack );
this.addChild( spMask );
this.addChild( spFore );

The getLive() method returns a sprite showing the live video. You can use it as the background for your app.

The getBackground() method returns a sprite showing the stationary background image. You can use it as the background for your app as well.

The getMask() method returns a transparent sprite showing the blue mask image. You will not normally need it.

The getForeground() method returns a transparent sprite showing the dynamic foreground image. It contains the person or objects in front of the camera. You can use it as the foreground for your app.

LiveChromaKey Properties

Set the properties below before calling the init() method.

chromakey.captureX = 320;
chromakey.captureY = 240;
chromakey.captureFPS = 30;

Webcam input resolution and frame rate.

chromakey.displayX = 640;
chromakey.displayY = 480;
chromakey.smoothing = false;

Output sprites' resolution and smoothing.

chromakey.workX = 80;
chromakey.workY = 60;

Internal working resolution in pixels.

The following method and property become available after initialization: a method to re-recognize the stationary background, and a Boolean property indicating whether the stationary background has been detected.

How To Compile It

Download LiveChromaKey source code from the Spark project.
svn co livechromakey

* The original posts for demo #1, demo #2 and demo #3 were written in Japanese.

HTML5.Audio - JavaScript MP3 Player Library (HTML5-like)

HTML5 introduces the <audio> element to play MP3 and other sound formats from HTML and JavaScript, but HTML5 is not yet in wide use. So I wrote HTML5.Audio, a JavaScript library that plays MP3 music via Flash.

Demo #1: Play sound by HTML5.Audio

Demo #2: MP3 player

Before Using It

You can manipulate an HTML5.Audio object like HTML5's Audio object.
To prepare it, load three JavaScript files in the HTML header.
<script type="text/javascript" src="js/swfobject.js"></script> 
<script type="text/javascript" src="js/jkl-js2as.js"></script>
<script type="text/javascript" src="js/html5-audio.js"></script>

Play MP3

Create an HTML5.Audio instance with an MP3 file URL, then call the play() method to play it. It is just as simple as HTML5's Audio object.
<script type="text/javascript"><!--

    var music = new HTML5.Audio('sound.mp3');
    music.play();
// --></script>

Note that the HTML5.Audio library supports only MP3 files at this time.
WAV files and other sound formats are not supported, due to the limitation of ActionScript's Sound class.


The HTML5.Audio library supports some properties imported from HTML5's Audio object: currentTime, volume, paused, ended, loop and duration.
Use the set()/get() methods to set or get these properties.
var ctime = music.get('currentTime');

Note that the currentTime property is updated only when playback is started or paused.
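As a hypothetical usage sketch, a progress display could poll currentTime through get() once per second; the formatTime helper below is my own, not part of the library.

```javascript
// formatTime is self-contained; the polling part assumes `music` is an
// HTML5.Audio instance that is already playing (hypothetical usage).
function formatTime(seconds) {
  var m = Math.floor(seconds / 60);
  var s = Math.floor(seconds % 60);
  return m + ':' + (s < 10 ? '0' + s : s);
}

// setInterval(function () {
//   if (!music.get('paused')) {
//     document.title = formatTime(music.get('currentTime'));
//   }
// }, 1000);
```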


The HTML5.Audio library supports several event callbacks: onloadstart, onload, onplay, onpause and onended.
Use the set() method to assign a callback function to an event property.
var onended = function () {
    alert( 'sound ended' );
};

music.set( 'onended', onended );


The HTML5.Audio library loads the html5-audio.swf Flash file, which is written in ActionScript 3.0.
To set the path to html5-audio.swf, call the getProxy() method before creating the first HTML5.Audio instance.
<script type="text/javascript"><!--

    HTML5.Audio.Proxy.getProxy({ swfPath: './html5-audio.swf', onready: init });
// --></script>

Note that the onready property is an event callback invoked when the HTML5.Audio library is ready to play. You can call the getProxy() method before the window.onload event is invoked.


  • js/html5-audio.js - HTML5.Audio library core (JS part)
  • js/jkl-js2as.js - JS-AS bridge (JS part)
  • js/swfobject.js - Library to load flash file
  • swf/html5-audio.swf - HTML5.Audio library Flash binary
  • swf/expressInstall.swf - Prompts the user to install or update Flash


Check out files from the Spark project repository:
svn co html5-audio

* The original post was written in Japanese at 2009/05/17 17:40.

JSARToolKit - AR (Augmented Reality) by JavaScript

After my talk at OSDC.TW 2009 in Taipei, I released JSAR's source code on the Spark project's repository:

JSARToolKit is a JavaScript library for building AR (augmented reality) applications.
This is the first JavaScript project on Spark. :)

Demo #1 - Show Logo

Try: JSAR Logo Demo
Download and print the marker PDF: Marker PDF (JSAR Logo Only)

Demo #1 shows a "JSAR" label in a DIV element overlaid on Flash. The red square border on the marker is drawn with canvas. This means both the label and the lines are controlled by JavaScript, not by ActionScript.
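Drawing such a border with canvas could look roughly like this (an illustrative sketch; the demo's actual code is not shown here, and the corner format of the detection result is my assumption):

```javascript
// Hypothetical sketch: stroke a quadrilateral through four marker
// corners on a 2D canvas context. corners = [[x, y], ...].
function drawMarkerRect(ctx, corners) {
  ctx.strokeStyle = 'red';
  ctx.lineWidth = 2;
  ctx.beginPath();
  ctx.moveTo(corners[0][0], corners[0][1]);
  for (var i = 1; i < corners.length; i++) {
    ctx.lineTo(corners[i][0], corners[i][1]);
  }
  ctx.closePath();  // close the last edge back to the first corner
  ctx.stroke();
}
```

In a browser, ctx would come from something like document.getElementById('overlay').getContext('2d').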

Demo #2 - Mic Volume

Try: Mic Volume Demo
Download and print the markers PDF: Markers PDF (4 patterns)

Demo #2 shows a label for each marker and changes its font size according to the microphone input volume.

Sample Code

<script type="text/javascript" src="../js/swfobject.js"></script>
<script type="text/javascript" src="../js/jsar.js"></script>
<script type="text/javascript"><!--

  var jsar;
  function init () {
    jsar = new JSAR( 'jsar_here' );
    jsar.drawMarkerRect = true;
    jsar.onDetected = function ( result ) { ... };
    jsar.onLost = function ( result ) { ... };
    jsar.captureX = 320;
    jsar.captureY = 240;
    jsar.displayX = 640;
    jsar.displayY = 480;
    jsar.setMarker( [ '../code/jsarlogo.pat' ] );
  }
  window.onload = init;
// --></script>
<div id="jsar_here"></div>

How To Compile It

JSARToolKit uses FLARToolKit internally. This means JSAR is not pure JavaScript, but Flash-powered. Download JSARToolKit from the Spark project with the svn command, and compile it with Flash CS4 or with FlashDevelop + the Flex SDK.

svn co jsar
mkdir -p jsar/src/org/libspark
svn co jsar/src/org/libspark/flartoolkit

In fact, a pre-compiled jsar.swf is available, so you don't need to compile it yourself.

I must say thank you to Saqoosha, who developed FLARToolKit.

* The original post was written in Japanese at 2009/05/05 01:46.

tdserver - An Experimental HTTP Interface for Tokyo Dystopia

I just wrote an experimental HTTP interface for Tokyo Dystopia. Tokyo Dystopia is an open-source full-text search system built on Tokyo Cabinet, a very fast key-value store by Mikio Hirabayashi. Some of the code is derived from Tokyo Tyrant, the network interface to Tokyo Cabinet.

tdserver is also an open source project on CodeRepos.


Compile Tokyo Cabinet, Tokyo Tyrant, Tokyo Dystopia, and then tdserver.
You don't need to install them at this point.

tar zxvf tokyocabinet-1.4.17.tar.gz
tar zxvf tokyotyrant-1.1.23.tar.gz
tar zxvf tokyodystopia-0.9.11.tar.gz

cd tokyocabinet-1.4.17
./configure && make
cd ..

cd tokyotyrant-1.1.23
CFLAGS=-I../tokyocabinet-1.4.17 LDFLAGS=-L../tokyocabinet-1.4.17 ./configure && make
cd ..

cd tokyodystopia-0.9.11
CFLAGS=-I../tokyocabinet-1.4.17 LDFLAGS=-L../tokyocabinet-1.4.17 ./configure && make
cd ..

svn co tdserver
cd tdserver


See the help message with the -h option.
./tdserver -h
./tdserver: A server of Tokyo Dystopia

./tdserver [-host name] [-port num] [-thnum num] [-tout num] [-dmn] [-pid path] [-kl] [-log path] [-ld|-le] [-sid num] [-mask expr] [-unmask expr] [dbname]


tdserver will create a td_base directory and then listen on port 1977 as an HTTP server.
./tdserver -port 1977 td_base


Three HTTP methods, GET, PUT and DELETE, are accepted as a RESTful interface.
Try it with Perl as follows.

* Insert (PUT method)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(PUT "http://localhost:1977/1",Content=>"hello world")->as_string;'
This inserts the text "hello world" as document #1.
The ID must be a positive integer.

* Fetch (GET method)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(GET "http://localhost:1977/1")->as_string;'
This returns document #1 directly.

* Search (GET method with query string)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(GET "http://localhost:1977/?q=hello")->as_string;'
This searches for documents containing the phrase "hello" and returns their ID numbers, comma-separated.

* Remove (DELETE method)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(HTTP::Request::Common::DELETE "http://localhost:1977/1")->as_string;'
This removes document #1.

Feedback and patches on CodeRepos are very welcome.