LiveChromaKey + LivePointers = AR Presentation!

I am a newbie to the wonderful ActionScript world, and over the past two weeks I have worked hard to write a couple of pure ActionScript 3.0 libraries, LiveChromaKey and LivePointers. I then gave a talk about them at the Spark project's SparkStudy/09 on May 28th.

SparkStudy is a monthly meeting for cutting-edge ActionScript developers in Tokyo, hosted by Yoshihiro Shindo, a.k.a. yossy. This was the first time for me, an ECMAScripter, to attend, and I enjoyed it.

As recently posted, LiveChromaKey is a bluescreen-less image-synthesizing engine for AR.

And the new library, LivePointers, is a color-detection engine that turns colored objects captured by a webcam into a new style of human interface device. For example, a fingercap can act as a 3D pointing device.
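To illustrate the idea behind such a color-detection engine (this is my own sketch, not LivePointers' actual API or algorithm): scan a webcam frame's pixel buffer for pixels close to the fingercap's color and take the centroid of the matching region as the pointer position.

```javascript
// Sketch of color-based pointer detection (hypothetical, not LivePointers'
// real code). Scan an RGBA pixel buffer for pixels within `tolerance` of a
// target color, and return the centroid of the matching region, or null.
function findPointer(rgba, width, target, tolerance) {
  var sumX = 0, sumY = 0, count = 0;
  for (var i = 0; i < rgba.length; i += 4) {
    var dr = rgba[i]     - target[0];
    var dg = rgba[i + 1] - target[1];
    var db = rgba[i + 2] - target[2];
    if (dr * dr + dg * dg + db * db <= tolerance * tolerance) {
      var p = i / 4;                   // pixel index
      sumX += p % width;               // column
      sumY += Math.floor(p / width);   // row
      count++;
    }
  }
  return count ? { x: sumX / count, y: sumY / count, size: count } : null;
}
```

The matched region's size could also drive a depth estimate, which is roughly how a colored cap can become a 3D pointing device: nearer caps cover more pixels.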

Anyway, you can try my presentation:

Yes, I appeared on the live projector screen during my presentation, like a newscaster on a weather program! It must have been nice for the audience members seated in the far rows of the room to see me on the screen. :)

In addition, I used fingercaps to manipulate the slides; for example, my right hand meant "next page." The fingercaps I used cost only JPY 105, approximately USD 1.00, for six fingers. I'm sure these cheap caps will definitely be an important user interface device of the future!

Note that you can also flip to the next page with the right arrow key of your keyboard, as insurance during a live presentation. :)

If you prefer a classic style of slides, try SlideShare:

The LivePointers library is still under development. You can try the current snapshot in the Spark project's repository:
* The original post of this was written in Japanese.

Tokyo Cloud Developers Meetup #02 feat. Google App Engine

We'll welcome Fred Sauer, Google Developer Advocate, as the special guest of the second Tokyo Cloud Developers Meetup on June 10. Come over to enjoy the latest topics around Google App Engine.

Register now!




LiveChromaKey - Bluescreen-less augmented IN reality (AR)

I wrote a new ActionScript library, LiveChromaKey, which is an image-synthesizing engine for AR. The key point is that it does not merely put something augmented ON reality; it puts something augmented IN reality. It is really portable, as it never needs a blue background screen: the key color for the chroma key is detected automatically, on the fly.
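The core idea behind bluescreen-less keying can be sketched like this (my own illustration of the concept, not LiveChromaKey's actual implementation): once a stationary background frame has been captured, any live pixel that differs enough from the stored background pixel is classified as foreground.

```javascript
// Background-subtraction mask, a conceptual sketch of bluescreen-less
// chroma keying. `live` and `background` are RGBA byte arrays of the same
// size; the result is one mask entry per pixel.
function buildMask(live, background, threshold) {
  var mask = new Array(live.length / 4);
  for (var i = 0; i < live.length; i += 4) {
    var dr = live[i]     - background[i];
    var dg = live[i + 1] - background[i + 1];
    var db = live[i + 2] - background[i + 2];
    // 1 = foreground (keep), 0 = background (keyed out)
    mask[i / 4] = (dr * dr + dg * dg + db * db > threshold * threshold) ? 1 : 0;
  }
  return mask;
}
```

This also explains why the demos below ask you to hide from the camera first: the engine needs a few seconds of you-free frames to learn the background.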

Demo #1

Try: Travelling In Egypt

A webcam is needed to try this; a blue background screen is not. First, hide from the camera so that LiveChromaKey can recognize the background view. After a few seconds, the Pyramid of Khafre will be shown. It's time to play! Now you can feel as if you are travelling in Egypt. :)

If you move the camera, click the screen to re-recognize the background on demand, then hide and wait a few seconds again. You may need to turn off your camera's intelligent features, such as automatic white balance and automatic exposure compensation; LiveChromaKey does not like such tunings.

Demo #2

Try: Minority Report-like Demo

This demo floats some photos behind you. Finger-pointing recognition like in the Minority Report movie is not implemented at this time.

Demo #3

Try: Four Sprites Of LiveChromaKey

This demo shows the four sprites which LiveChromaKey provides.

LiveChromaKey Sample Code

var chromakey:LCK_Core = new LCK_Core();

var spLive:Sprite = chromakey.getLive();
var spBack:Sprite = chromakey.getBackground();
var spMask:Sprite = chromakey.getMask();
var spFore:Sprite = chromakey.getForeground();

this.addChild( spLive );
this.addChild( spBack );
this.addChild( spMask );
this.addChild( spFore );

The getLive() method returns a sprite which shows the live video. You can use it as a background for your app.

The getBackground() method returns a sprite which shows the stationary background image. You can use it as a background for your app as well.

The getMask() method returns a transparent sprite which shows the blue mask image. You will normally not need to use it.

The getForeground() method returns a transparent sprite which shows the dynamic foreground image. It contains the person or objects in front of the camera. You can use it as a foreground for your app.

LiveChromaKey Properties

Set the properties below before calling the init() method.

chromakey.captureX = 320;
chromakey.captureY = 240;
chromakey.captureFPS = 30;

Webcam input source's resolution and frame rate.

chromakey.displayX = 640;
chromakey.displayY = 480;
chromakey.smoothing = false;

Output sprites' resolution and smoothing.

chromakey.workX = 80;
chromakey.workY = 60;

Working resolution, in pixels.

The following method and property are for use after it has started: a method to re-recognize the stationary background, and a Boolean property which tells whether the stationary background has been detected.

How To Compile It

Download LiveChromaKey source code from the Spark project.
svn co livechromakey

* The original posts for demo #1, demo #2 and demo #3 were written in Japanese.

HTML5.Audio - JavaScript MP3 Player Library (HTML5-like)

HTML5 allows the <audio> element to play MP3 and other sound formats from HTML and JavaScript. But HTML5 is still not mainstream, so I wrote HTML5.Audio, a JavaScript library to play MP3 music via Flash.

Demo #1: Play sound by HTML5.Audio

Demo #2: MP3 player

Before Using It

You can manipulate an HTML5.Audio object like HTML5's Audio object.
To prepare it, load three JavaScript files in the HTML header.
<script type="text/javascript" src="js/swfobject.js"></script> 
<script type="text/javascript" src="js/jkl-js2as.js"></script>
<script type="text/javascript" src="js/html5-audio.js"></script>

Play MP3

Create an HTML5.Audio instance with an MP3 file URL, then call the play() method to play it. It's totally simple, like HTML5's Audio object.
<script type="text/javascript"><!--

var music = new HTML5.Audio('sound.mp3');
music.play();

// --></script>

Note that the HTML5.Audio library supports only MP3 files at this time.
WAV files and other sound formats are not supported; ActionScript's Sound class imposes this limit.


The HTML5.Audio library supports some properties imported from HTML5's Audio object: currentTime, volume, paused, ended, loop and duration.
You need to call the set()/get() methods to set/get these properties.
var ctime = music.get('currentTime');

Note that the currentTime property is updated only when the playing music is started or paused.
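The set()/get() indirection presumably exists because every property access has to cross the JavaScript–Flash bridge rather than touch a native field. A minimal sketch of that accessor style (hypothetical, not the library's internals):

```javascript
// Hypothetical property bag illustrating the set()/get() style used by
// HTML5.Audio: values live behind accessor methods, so reads and writes
// can be forwarded elsewhere (a real bridge would forward to Flash).
function PropertyBag(initial) {
  this._props = initial || {};
}
PropertyBag.prototype.get = function (name) {
  return this._props[name];
};
PropertyBag.prototype.set = function (name, value) {
  this._props[name] = value;   // forwarding point for a JS-to-Flash bridge
  return value;
};

// Usage mirrors the snippets in this post:
var music = new PropertyBag({ volume: 1.0, paused: false });
music.set('volume', 0.5);
```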


The HTML5.Audio library supports some callback events: onloadstart, onload, onplay, onpause and onended.
You need to call the set() method to assign a callback function to an event property.
var onended = function () {
    alert( 'sound ended' );
};

music.set( 'onended', onended );


The HTML5.Audio library loads the html5-audio.swf Flash file, which is written in ActionScript 3.0.
To set the path to html5-audio.swf, call the getProxy() method before creating the first instance of HTML5.Audio.
<script type="text/javascript"><!--

HTML5.Audio.Proxy.getProxy({swfPath:'./html5-audio.swf', onready: init});

// --></script>

Note that the onready property is an event which will be invoked when the HTML5.Audio library is ready to play. You can call the getProxy() method before the window.onload event is invoked.


  • js/html5-audio.js - HTML5.Audio library core (JS part)
  • js/jkl-js2as.js - JS-AS bridge (JS part)
  • js/swfobject.js - Library to load flash file
  • swf/html5-audio.swf - HTML5.Audio library Flash binary
  • swf/expressInstall.swf - Prompts users to install/update Flash


Check out files from the Spark project repository:
svn co html5-audio

* The original post of this was written in Japanese at 2009/05/17 17:40.

JSARToolKit - AR (Augmented Reality) by JavaScript

After my talk at OSDC.TW 2009 in Taipei, I've released JSAR's source code in the Spark project's repository:

JSARToolKit is a JavaScript library to run AR (augmented reality).
This is the first JavaScript project on Spark. :)

Demo #1 - Show Logo

Try: JSAR Logo Demo
Download and print the marker PDF: Marker PDF (JSAR Logo Only)

Demo #1 shows a "JSAR" label in a DIV element overlaid on Flash. The red square border on the marker is drawn with canvas. This means that both the label and the lines are controlled by JavaScript, not by ActionScript.

Demo #2 - Mic Volume

Try: Mic Volume Demo
Download and print the markers PDF: Markers PDF (4 patterns)

Demo #2 shows a label for each marker and changes its font size according to the microphone input volume.

Sample Code

<script type="text/javascript" src="../js/swfobject.js"></script>
<script type="text/javascript" src="../js/jsar.js"></script>
<script type="text/javascript"><!--

  var jsar;
  function init () {
    jsar = new JSAR( 'jsar_here' );
    jsar.drawMarkerRect = true;
    jsar.onDetected = function ( result ) { ... };
    jsar.onLost = function ( result ) { ... };
    jsar.captureX = 320;
    jsar.captureY = 240;
    jsar.displayX = 640;
    jsar.displayY = 480;
    jsar.setMarker( [ '../code/jsarlogo.pat' ] );
  }
  window.onload = init;

// --></script>
<div id="jsar_here"></div>
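What an onDetected handler typically does is position the overlay DIV from the marker's corner coordinates. A sketch of the geometry (the corner-array shape here is hypothetical, not necessarily what JSAR passes in its result object):

```javascript
// Centroid of a detected marker's four screen-space corners; an overlay
// label can then be placed at that point. The [[x, y], ...] corner format
// is an assumption for illustration.
function markerCentroid(corners) {
  var x = 0, y = 0;
  for (var i = 0; i < corners.length; i++) {
    x += corners[i][0];
    y += corners[i][1];
  }
  return { x: x / corners.length, y: y / corners.length };
}

// e.g. label.style.left = c.x + 'px'; label.style.top = c.y + 'px';
```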

How To Compile It

JSARToolKit uses FLARToolKit internally. This means JSAR is not pure JavaScript, but Flash-powered. Download JSARToolKit from the Spark project with the svn command, and compile it with Flash CS4 or with FlashDevelop + the Flex SDK.

svn co jsar
mkdir -p jsar/src/org/libspark
svn co jsar/src/org/libspark/flartoolkit

In fact, you can use the pre-compiled jsar.swf, so you don't need to compile it yourself.

I need to say thank you to Saqoosha, who developed FLARToolKit.

* Original post of this was written in Japanese at 2009/05/05 01:46.

tdserver - An Experimental HTTP Interface for Tokyo Dystopia

I just wrote an experimental HTTP interface for Tokyo Dystopia. Tokyo Dystopia is an open-source full-text search system built on Tokyo Cabinet, a very fast key-value store by Mikio Hirabayashi. Some code is derived from Tokyo Tyrant, the network interface for Tokyo Cabinet.

tdserver is also an open-source project on CodeRepos.


Compile Tokyo Cabinet, Tokyo Tyrant, Tokyo Dystopia and then tdserver.
You don't need to install them at this point.

tar zxvf tokyocabinet-1.4.17.tar.gz
tar zxvf tokyotyrant-1.1.23.tar.gz
tar zxvf tokyodystopia-0.9.11.tar.gz

cd tokyocabinet-1.4.17
./configure && make
cd ..

cd tokyotyrant-1.1.23
CFLAGS=-I../tokyocabinet-1.4.17 LDFLAGS=-L../tokyocabinet-1.4.17 ./configure && make
cd ..

cd tokyodystopia-0.9.11
CFLAGS=-I../tokyocabinet-1.4.17 LDFLAGS=-L../tokyocabinet-1.4.17 ./configure && make
cd ..

svn co tdserver
cd tdserver


See the help message with the -h option.
./tdserver -h
./tdserver: A server of Tokyo Dystopia

./tdserver [-host name] [-port num] [-thnum num] [-tout num] [-dmn] [-pid path] [-kl] [-log path] [-ld|-le] [-sid num] [-mask expr] [-unmask expr] [dbname]


tdserver will create a td_base directory and then listen on port 1977 as an HTTP server.
./tdserver -port 1977 td_base


Three HTTP methods, GET, PUT and DELETE, are accepted as a RESTful interface.
Try it with Perl as follows.

* Insert (PUT method)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(PUT "http://localhost:1977/1",Content=>"hello world")->as_string;'
This inserts a text "hello world" as a document #1.
The document ID must be a positive number.

* Fetch (GET method)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(GET "http://localhost:1977/1")->as_string;'
This returns document #1 directly.

* Search (GET method with query string)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(GET "http://localhost:1977/?q=hello")->as_string;'
This searches for documents which contain the phrase "hello" and returns their ID numbers, comma-separated.

* Remove (DELETE method)
perl -MLWP::UserAgent -MHTTP::Request::Common -e 'print LWP::UserAgent->new->request(HTTP::Request::Common::DELETE "http://localhost:1977/1")->as_string;'
This removes document #1.
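Because the interface is plain HTTP, any client works, not just Perl. A sketch in JavaScript of building the search URL and parsing the comma-separated ID list it returns (the helper names are mine; the URL and response formats follow the examples above):

```javascript
// Build a tdserver search URL and parse its comma-separated ID response.
// A running tdserver is needed for a real request; these helpers only
// handle the request and response formats shown in the Perl examples.
function searchUrl(base, phrase) {
  return base + '/?q=' + encodeURIComponent(phrase);
}
function parseIds(body) {
  body = body.replace(/^\s+|\s+$/g, '');   // trim
  if (!body) return [];                    // empty body: no hits
  var parts = body.split(',');
  var ids = [];
  for (var i = 0; i < parts.length; i++) ids.push(parseInt(parts[i], 10));
  return ids;
}
```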

Feedback and patches on CodeRepos are really welcome.

JSAR (JavaScript Augmented Reality) at OSDC.TW 2009 Taipei

Last weekend, I flew to Taipei to attend OSDC.TW 2009, the Open Source Developers Conference in Taiwan. I gave a talk there titled "The JUI Digest Taipei". JUI means JavaScript User Interface:

The JUI 2008 Tokyo (first)
The 2nd JUI Conference in Adobe MAX Japan 2009

In addition to the recent topics from the 2nd JUI, I showed the JSAR (JavaScript Augmented Reality) demonstration.

JSARToolKit is a library to run AR from JavaScript. It works as a proxy wrapper around a bridge application that uses FLARToolKit.

Try it : JSAR Demo #1
Get marker PDF : JSAR logo for print
Get marker PDF : 4 markers for print

Then, the day's main dish was "AiR Xiaolongbao".
A dozen xiaolongbao (小籠包) were shown on the table, by JavaScript!

Try it : AiR Xiaolongbao Demo
Get marker PDF : XiaoLongbao markers for print (free)

This was an homage to Air Yakiniku (Air焼肉), as a Chinese version.

Some attendees seated in the front rows helped me show that JSAR supports multiple markers. Jesse drew an improvised "JSAR" marker, which was finally recognized at the end of my talk.

See my slides on slideshare.

This was my second trip to Taipei, and we really enjoyed it there again. Last year, I gave another talk, titled "DOM manipulation by Wiimote/Gainer over HTTP", at OSDC.TW 2008. That was also the first time I talked about the Wii Remote. I evolved that talk later and gave it at several YAPCs and other places. I guess I will talk about JSAR again at other conferences this year as well. See you soon!

BTW, JSARToolKit uses FLARToolKit internally. I must say thank you to Saqoosha, who is one of the coolest Japanese Flash guys. He will give a talk about FLARToolKit at the FITC Toronto 2009 conference this weekend. Don't miss it!

The First Tokyo Cloud Developers Meetup was over.

As I mentioned, we had the first Tokyo Cloud Developers Meetup this Thursday, with great success. It got 64 registrants, which was over capacity for the seminar room at Amazon Japan K.K. We'd like to hold the next meetup soon. Join the group and stay tuned.

Photo by Gui Trento "Blue sky over the monday morning"

Above was the opening slide for the event. I love the picture of the beautiful blue sky with clouds. Thanks, Gui.


Jeff Barr (Amazon)

He gave his talk in English, without interpretation.
I guess most of the Japanese attendees could understand most of what he said, with the help of his slides and demos.
He talked a bit faster in the last half, though. :-)

The busy evangelist had other conferences in Japan, QCon Tokyo 2009 and an ITpro Technology Conference on Amazon cloud services. They cost JPY 30,000 each, approximately USD 300. The guys who attended our more techie meetup were lucky, because it was free of charge.

After his talk, Japanese developers gave lightning talks.

Lightning Talks

1. Yamazaki Yasuhiro - slideshare
2. Yuki Namikawa
3. Takao Funami - slideshare
4. Manabu Igarashi
5. Yukio Ando - slideshare

These were more techie and interesting.

This was my first time inside Amazon's offices in Japan. Although they provide the most popular cloud services in the world, they don't provide a connection to the cloud from the conference room, due to their security policy. :-)
Anyway, most attendees were surprised by and admired Amazon's nice office.

Finally, I must say thank you again to Jeff Barr.
We're longingly awaiting the news from Amazon that EC2 gets an Asia region with Japan-zone servers.
* The original post of this was written in Japanese.

Tokyo Cloud Developers Meetup on April 9

Jeff Barr

Jeff Barr, Amazon's Senior Manager of Cloud Computing Solutions, will be visiting Tokyo April 9-13 as part of his Asia tour. Come over on April 9 to hear the latest news on Amazon's plans for Web Services at an informal developer meetup. Peter and I are the organizers of the event.


  • Date and Time: April 9, 2009 from 19:30 to 21:00 (doors open 19:00)
  • Location: Amazon Japan K.K. [map]
  • Address: Shibuya Cross Tower, 2-15-1 Shibuya, Shibuya-ku, Tokyo


  • Keynote by Jeff Barr, Senior Amazon Evangelist
  • Lightning talks by AWS experts in Japan (We're looking for speakers!)
  • Q&A / Free discussion
After the meetup, we'll have a nomikai at Tengu in Shibuya. [map]
The cost will be split amongst all participants, probably around 3,000-4,000 yen per person.

Only 40 seats available so sign up quickly!
Register now via ATND, or just send us an email:
Oh, and please let us know beforehand if you would like to join the nomikai or not.

* Note that this is NOT an official event by Amazon Japan K.K.

The history of JavaScript's 3D tech development

Before most popular browsers started to support a "3d" canvas context, we JavaScript developers struggled with how to implement 3D in JavaScript without any extensions like Java, Flash, etc. Here is a part of the history of JavaScript 3D tech development.

Animation.Cube - April 2006

Three years ago, I wrote a library named Animation.Cube, which slices images into many vertical lines to show a rotating cube. I demonstrated it at the first technical talks of the Shibuya.js community. The code is on JSAN. See also digg.

Triangles by Border of Div - October 2006

Useless Pickles (Jeff Lau) showed polygons drawn as many triangles made from <div> elements, using a trick with borders. It meant we could develop Virtua Fighter (1) in JavaScript.
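The border trick works because a zero-size element with one colored border and transparent neighbors renders as a triangle. A sketch of generating such an inline style (my simplified reconstruction of the general technique, not Jeff Lau's actual code):

```javascript
// Generate inline CSS for an upward-pointing triangle via the border
// trick: the element itself is 0x0; the bottom border forms the shape,
// and transparent left/right borders bevel its sides.
function triangleStyle(width, height, color) {
  return 'width:0;height:0;' +
         'border-left:' + (width / 2) + 'px solid transparent;' +
         'border-right:' + (width / 2) + 'px solid transparent;' +
         'border-bottom:' + height + 'px solid ' + color + ';';
}
```

Two such triangles can tile any quad, which is how a renderer built this way can cover a whole polygon mesh.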

3D by Canvas - March 2008

Again, I wrote a new demo using the <canvas> element to draw wireframe images, and polygons as well. The code was written for another demo manipulating Wii Remote controllers, which I presented at the OSDC.TW 2008 (Taipei), YAPC::Asia 2008 (Tokyo), YAPC::NA 2008 (Chicago), and YAPC::Europe 2008 (Copenhagen) conferences.
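The heart of any canvas wireframe renderer is a perspective projection from 3D points to 2D canvas coordinates. A minimal sketch (the focal length and screen-center parameters are my assumptions, not the demo's actual values):

```javascript
// Simple pinhole projection: screen coordinates shrink in proportion to
// depth, so points deeper in z draw closer to the screen center.
function project(point, focal, cx, cy) {
  var scale = focal / (focal + point.z);
  return {
    x: cx + point.x * scale,
    y: cy + point.y * scale
  };
}
```

Projecting each vertex this way and connecting the results with lineTo() calls is all a wireframe demo needs.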

3D Renderer with Textures - March 2008

At just about the same time, Jacob Seidelin gave a great demo with texture mapped polygons using <canvas> element.

Triangle Texture Mapping on Wii - April 2008

Daniel Gump released the Wii Opera SDK, which had a triangle texture-mapping feature. It is an SDK for Nintendo Wii's Internet Channel. He said it could show 500 textured triangles per second on the Wii.

Motion Blur - May (maybe) 2008

Kaarel Lumi presented a beautiful motion blur using an alpha-blending technique with fillStyle. Thanks, @moriyoshi.
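The fillStyle trick: instead of clearing the canvas each frame, fill it with a semi-transparent rectangle, so old pixels fade over successive frames and moving shapes leave trails. Numerically, each channel decays toward the fill color; a sketch of the per-pixel math (my illustration of the general technique):

```javascript
// One motion-blur "clear" step: source-over blending of a semi-transparent
// fill (opacity `alpha`) on top of the previous frame's pixel value.
// Repeated every frame, old trails decay geometrically toward fillColor.
function fadeStep(pixel, fillColor, alpha) {
  return fillColor * alpha + pixel * (1 - alpha);
}
```

On a real canvas this is just `ctx.fillStyle = 'rgba(0,0,0,0.25)'; ctx.fillRect(0, 0, w, h);` at the top of the animation loop.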

Projective Texturing - November 2008

Steven Wittens also wrote a projective-transform renderer. I'm interested in his technique of adjusting the size of each cut image piece: it makes many more cuts for the front pieces.

Sphere Environment Mapping - February 2009

Satoshi Ueyama reported that Chrome has an extremely fast canvas rendering engine named Skia, demonstrating it with his demos and benchmarks. He also described in detail how to implement texture mapping with canvas in his post, and implemented physics and a sphere environment-mapping feature on top of it. His demos showed us that we can now run JavaScript 3D at real-time, daily-usable performance on Chrome, and I guess the rest of the popular browsers will soon come to the stage. See also my post.
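Sphere environment mapping looks up a texel in a pre-rendered sphere map from the per-vertex reflection vector. The standard formula (the same one OpenGL's GL_SPHERE_MAP uses; whether his implementation matches it exactly is my assumption) can be sketched as:

```javascript
// Map a unit reflection vector (rx, ry, rz) to (u, v) coordinates in a
// sphere environment map, per the classic sphere-mapping formula:
//   m = 2 * sqrt(rx^2 + ry^2 + (rz + 1)^2),  u = rx/m + 0.5,  v = ry/m + 0.5
function sphereMapUV(rx, ry, rz) {
  var m = 2 * Math.sqrt(rx * rx + ry * ry + (rz + 1) * (rz + 1));
  return { u: rx / m + 0.5, v: ry / m + 0.5 };
}
```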

Tweet this - a bookmarklet to post URL to Twitter

This is my first bookmarklet.
The bookmarklet posts the current page's title and URL to Twitter. I guess many similar bookmarklets can be found elsewhere, though.

Tweet this

Drag & drop the link above to your browser's bookmark toolbar.
Its code is derived from delicious's bookmarklet.
javascript:(function(){f='http://twitter.com/home?status='+encodeURIComponent(document.title+' '+window.location.href+' ');a=function(){if(!window.open(f,'_blank'))location.href=f};if(/Firefox/.test(navigator.userAgent)){setTimeout(a,0)}else{a()}})()

Thanks, delicious!
I don't know the license type of the code snippet, however. ;-)

Incredible JavaScript+Canvas 3D demos from Japan!

Mr. Satoshi Ueyama has hacked open a new era of JavaScript 3D tech by unveiling the real power of Google Chrome. Satoshi is one of the great JavaScript hackers in Japan, also known as gyuque, and is listed in the article "30 Japanese geeks you should follow on Twitter". (cached)

He has introduced a brand new JavaScript technique using Canvas for 3D in his post. Browsing with Google Chrome is strongly recommended for all the demos below.

I'm sure this could be one giant leap for JavaScript user interface technologies.

Demo #1: 3D texture mapped with physical computing

This is his first demo, which shows 3D texturing with Canvas. You can click on the cloth to make it wave.

Demo #2: Hatsune Miku 3D with OOP

The 3D Miku is implemented OO-style, as an object which has a swing() method. This means she swings the green onion in her hand when the swing() method is called. Click on her.

She wears hundreds of polygons. The demo was created for Paul Bakaus, the jQuery UI library's lead, who was visiting Japan.

Demo #3: 3D iPod touch with environment mapping

You may notice that all the demos above have the same width, 480px. Yes, that is exactly the horizontal viewport size of the iPhone and iPod touch. Now you can see an iPod touch displayed on your iPod touch with demo #3!

It can be slightly slow on the device itself; however, it's still cool. These demos show us that we rarely need Flash anymore on the mobile platform, right? ;)

But the most important point of the demo is not its width. You need to look behind it: he also implemented a reflection-mapping feature in his lib.

All the demos above are coded in pure JavaScript using Canvas. There is no Java, Flash, ActionScript, etc., just the Web standards.

You can see that Google Chrome runs all the demos much faster than any other browser. It has a great rendering engine named Skia. CPUs are already fast enough to calculate most things, you know, and JITs are now implemented as well. But canvas renderers differ greatly in performance. Skia is special. Safari is also fine. Firefox, both 3.0 and the JIT-ed 3.1, seems to be slower, unfortunately. We don't even have to say that IE has no capability to run these.

Gyuque ran several benchmarks showing that Skia has a significant advantage as a canvas rendering engine.

This 3D JavaScript tech could be one of the killer applications of Chrome now. I guess the next generation of browsers will soon compete on Canvas performance.

The original of this post was written for my Japanese blog.
You should see more detail in gyuque's post.

The 2nd JUI Conference in Adobe MAX Japan 2009

Almost one year has passed since we held the 1st JUI conference in Tokyo. The JUI is a conference focused on user-interface techs using JavaScript. At the end of this January, we held the JUI again, as a sub-conference within the Adobe MAX Japan 2009 conference. I think it's definitely true that Adobe is a really big-hearted company: five JavaScript guys got to talk only about JavaScript, at a session whose Japanese sub-title was "we don't need Flash any more!"

At first, I gave an introduction talk.

JavaScript Hot Topics 2008 (Adobe MAX Edition)
#10 - 10th Anniversary of MM_SwapImage()
#9 - Shibuya.js comes to Kyoto
#8 - ECMAScript 4 failed. Now 3.1 instead.
#7 - Adobe launches Flash 10
#6 - iPhone 3G integrated with JavaScript
#5 - Many companies switching to JavaScript
#4 - Microsoft follows web standards by IE8
#3 - Varieties of JavaScript libraries
#2 - Playing .swf by JavaScript on the scene
#1 - Too Rapid JavaScript. No JIT, No Life.

The 2nd speaker was Yu Kobayashi, a.k.a. yukoba, the author of the HotRuby virtual machine. His talk was about "How to implement a Flash Player."

The next was Satoshi Ueyama, a.k.a. gyuque. He has also implemented another Flash player, named "JSplash", whose trick is to translate ActionScript code into JavaScript code. This pre-compiling feature gives it performance comparable to Adobe's native Flash Player. That means we don't need the Flash player any more.

The 4th was Moriyoshi Koizumi, one of the Japanese PHP committers. He gave a talk about "JavaScript's Sound Generation" using the data scheme.

The last was Hitoshi Amano, a.k.a. amachang. He always starts developing a new presentation tool before he writes the slides for each conference. This time he made it in 3D. Note that only nightly builds of WebKit (at that time; Safari 4 is available now) could show his slides. His talk itself was about "DOM Performance Tuning."

Anyway, at Adobe's conference, we were happy to let them know that these crazy JavaScript guys exist and are working hard for a high level of technique and user experience.

The 2nd JUI Speakers

The original post of this was written in Japanese.