

Showing posts from April, 2010


Probably due to the predominant position of a few actors (Google, Amazon, Facebook, etc.), I often feel that everything I learned about small-scale systems will soon be obsolete. The current fashionable IT word is NoSQL. If you only have a minimum of IT culture and read nothing but the buzzy articles, you could think the SQL thing is dead. After a short analysis, I collected a few answers and mainly understood that both old and new technologies will have a long life, but their opposition really makes us step back and reconsider. First, there is ACID, for atomicity, consistency, isolation and durability: the properties implemented by a database management system (DBMS) to make sure transactions proceed reliably. Most of the time we use databases of this type in our in-house computing systems. But when it comes to distributed ones (many nodes), you have to understand that a choice must be made. In 2000, Eric Brewer gave a keynote speech at the ACM Symposium on the Principles of D…
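The choice Brewer describes can be sketched in a few lines. Below is a hypothetical toy model (the `Cluster`, `Replica`, and `Strategy` names are mine, not from any real system): two replicas of a key-value store under a network partition, contrasting a "CP" strategy, which refuses writes to stay consistent, with an "AP" strategy, which stays available but lets the replicas diverge.

```typescript
// Toy model of the consistency/availability trade-off during a partition.
type Strategy = "CP" | "AP";

class Replica {
  data = new Map<string, string>();
}

class Cluster {
  a = new Replica();
  b = new Replica();
  partitioned = false;

  constructor(private strategy: Strategy) {}

  // Writes go through replica `a`; replication to `b` only succeeds
  // while the network between them is healthy.
  write(key: string, value: string): boolean {
    if (this.partitioned && this.strategy === "CP") {
      return false; // sacrifice availability to preserve consistency
    }
    this.a.data.set(key, value);
    if (!this.partitioned) {
      this.b.data.set(key, value); // synchronous replication
    }
    return true; // under AP, the replicas may now disagree
  }

  consistent(key: string): boolean {
    return this.a.data.get(key) === this.b.data.get(key);
  }
}

const cp = new Cluster("CP");
cp.partitioned = true;
console.log(cp.write("x", "1")); // false: write refused during partition

const ap = new Cluster("AP");
ap.partitioned = true;
ap.write("x", "1");
console.log(ap.consistent("x")); // false: replicas have diverged
```

A real distributed store is vastly more subtle (quorums, anti-entropy, conflict resolution), but the sketch shows why a partitioned system cannot give you both answers at once.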

GWT Quake in the browser

2010: the browser is a platform. Step by step, HTML5 arrives. Three Googlers present their 20% project: a port of the Quake II engine to HTML5 using the Google Web Toolkit. For more information, please visit Quake2-gwt-port. On the project page, you read:

In the port, we use WebGL, the Canvas API, HTML5 elements, the local storage API, and WebSockets to demonstrate the possibilities of pure web applications in modern browsers such as Safari and Chrome. The port is based on the Jake2 project, compiled to JavaScript using the Google Web Toolkit (GWT). Jake2 is a Java port of the original Quake II source code, which was open sourced by id Software. To make the Jake2 code work with GWT, we have:

- Created a new WebGL-based renderer
- Ported the network layer for multiplayer games from UDP to the WebSocket API
- Made all resource loading calls asynchronous
- Created a GWT implementation of Java nio buffers based on WebGL arrays (to be ported to ECMAScript Typed Arrays)
- Impl…
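The last listed item, emulating `java.nio` buffers on top of browser arrays, can be sketched as follows. This is a hypothetical minimal `FloatBuffer` of my own, not code from the actual port: a backing `Float32Array` plus the position/limit bookkeeping that Java NIO code expects.

```typescript
// Minimal java.nio.FloatBuffer-style wrapper over a typed array.
class FloatBuffer {
  private position = 0;
  private limit: number;

  constructor(private backing: Float32Array) {
    this.limit = backing.length;
  }

  // Relative put: write at the current position, then advance it.
  put(value: number): this {
    if (this.position >= this.limit) throw new RangeError("buffer overflow");
    this.backing[this.position++] = value;
    return this;
  }

  // flip() prepares the buffer for reading back what was just written.
  flip(): this {
    this.limit = this.position;
    this.position = 0;
    return this;
  }

  // Relative get: read at the current position, then advance it.
  get(): number {
    if (this.position >= this.limit) throw new RangeError("buffer underflow");
    return this.backing[this.position++];
  }

  remaining(): number {
    return this.limit - this.position;
  }

  // The position..limit window as a raw typed array, the form WebGL
  // entry points such as gl.bufferData consume without copying.
  view(): Float32Array {
    return this.backing.subarray(this.position, this.limit);
  }
}

const buf = new FloatBuffer(new Float32Array(4));
buf.put(1.5).put(2.5).flip();
console.log(buf.remaining()); // 2
console.log(buf.get()); // 1.5
```

The appeal of this shape is that ported Java rendering code keeps its familiar put/flip/get idiom while the underlying storage is exactly what the browser's GL binding wants.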

Toward public cloud

We are now living in the second decade of the 21st century, and having data and services online is no longer an advantage; not having them is a disadvantage. We used to deploy servers in data centers to make sure our customers had permanent access to our services, or to give the impression that we were always online. Our internal IT infrastructure was changing step by step, and the recurring question used to be: should we base everything on Microsoft to minimise compatibility problems between products, or should we take the best of every product and support their interaction ourselves? We were living with the philosophy of IT perimeters delimited by security devices, even while users were spending their time collaborating with others, sharing their documents by email, located somewhere. Somewhere connected. Today, everything looks different due to our understanding of new trends and the near future: the soon-to-be soaring energy price, the unavoidable trend of collaboration pushed by social…