The Weight Table

The Weight Table is an ordinary dining or coffee table augmented with load cells. We have augmented several tables by unobtrusively adding a load cell to each corner. These sensors enable us to detect interactions on the table, along with information such as the position and weight of objects that are placed on or removed from it. The major challenge we address is augmenting the table without disturbing its everyday use.

How does it work?

The table is based on our DIY Smart-its platform. We designed a load add-on board that interfaces with the four industrial load cells, one in each corner. By observing the load changes of the load cells we determine event types such as the placement and removal of objects, or a person moving a finger across the surface. We can also retrieve the position of an object by correlating the values of the individual load cells once the event type (placement or removal) has been recognised.
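To make the position step concrete, here is a minimal Python sketch of the calculation (our own illustration, not the Smart-its firmware; the table dimensions, corner layout and names are assumptions): the object's position is the average of the corner coordinates weighted by the load change each cell sees.

    # Sketch: estimate the position of a newly placed object from the load
    # change at the four corner load cells. Table dimensions, corner layout
    # and names are illustrative, not taken from the Smart-its firmware.
    W, D = 1.2, 0.8                                  # table width/depth in metres
    CORNERS = [(0.0, 0.0), (W, 0.0), (0.0, D), (W, D)]

    def object_position(before, after):
        """Return (x, y, weight) of a placed object from the readings (kg)
        sampled just before and just after the placement event."""
        deltas = [a - b for a, b in zip(after, before)]
        total = sum(deltas)
        if total <= 0:
            return None                              # load decreased: a removal, not a placement
        # Each corner contributes in proportion to the share of the new weight it carries.
        x = sum(d * cx for d, (cx, _) in zip(deltas, CORNERS)) / total
        y = sum(d * cy for d, (_, cy) in zip(deltas, CORNERS)) / total
        return x, y, total

    print(object_position([2.0, 2.0, 2.0, 2.0], [2.6, 2.2, 2.15, 2.05]))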

This is a picture of the coffee table version. In this version we attached the load cells between the frame and the table top. The table top rests on the load cells so that they are not noticeable to the user. In fact, the table can be used without any restrictions as a normal coffee table.

The dining table variant uses a different type of load cell in order to handle higher weights. In this variant we put the load cells under the legs of the table. Thus, everybody can have an augmented table. The instructions are:

  • Buy a table with suitable leg dimensions from your favourite furniture store (we chose the NORDEN dining table from IKEA)
  • Get the “Table Augmentation Kit” (Load cells with boxes, power adapter, smart-it with add-on, serial cable)
  • Put the boxes with the load cells under each leg and attach the wires to the table
  • Attach the smart-its to the table and connect the load cells
  • Choose your favourite table application and “Switch on the table”

Of course, the table can be used as unobtrusively as the coffee table version.
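As a rough idea of what "switching on the table" involves on the host side, the sketch below reads load samples from the Smart-it over the serial cable using pyserial. The port name, baud rate and line format (four comma-separated raw values per sample) are assumptions; the real protocol and code are on the Smart-its Wiki pages mentioned below.

    # Sketch: read load-cell samples from the Smart-it over the serial cable.
    # Port name, baud rate and line format are assumptions; see the
    # Smart-its Wiki pages for the actual protocol.
    import serial                                    # pyserial

    with serial.Serial("/dev/ttyS0", 115200, timeout=1) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue                             # timeout, nothing received
            try:
                cells = [int(v) for v in line.split(",")]
            except ValueError:
                continue                             # skip malformed lines
            if len(cells) == 4:
                print("corner loads:", cells)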

Further technical information, including code examples, is available from the Smart-its Wiki pages.

Applications

The weight table has been used in three different applications:

  1. As a pointing device
  2. As a basis for a reconfigurable tangible interface
  3. For tracking everyday objects in interactive environments

In our paper at UI4ALL we demonstrated the idea of augmenting existing objects in a living environment to control existing desktop applications. We suggested the weight table as a solution to the challenge of providing additional functionality without altering the everyday usage of the device and its affordances. We are developing this idea further by using the weight table as the basis for a reconfigurable tangible interface. The idea rests on a small set of simple gestures, performed with arbitrary physical objects and recognised by the weight table. As a starting point for appropriate “table interfaces”, our AIMS paper describes how a user can dynamically reconfigure, and thus personalise, the interface to the controlled application.
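As a purely illustrative sketch of the reconfiguration idea (not code from the AIMS paper; gesture and command names are made up), recognised table gestures are looked up in a binding table that the user can rebind at run time:

    # Sketch: a reconfigurable mapping from recognised table gestures to
    # application commands. Gesture and command names are made up.
    from typing import Callable, Dict

    class TangibleInterface:
        def __init__(self) -> None:
            self.bindings: Dict[str, Callable[[], None]] = {}

        def bind(self, gesture: str, command: Callable[[], None]) -> None:
            """(Re)bind a gesture to a command - the reconfiguration step."""
            self.bindings[gesture] = command

        def on_gesture(self, gesture: str) -> None:
            """Called by the table once a gesture has been recognised."""
            command = self.bindings.get(gesture)
            if command:
                command()

    ui = TangibleInterface()
    ui.bind("put", lambda: print("play"))
    ui.bind("remove", lambda: print("pause"))
    ui.on_gesture("put")                         # -> play
    ui.bind("put", lambda: print("volume up"))   # the user personalises the mapping
    ui.on_gesture("put")                         # -> volume up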

At the Emerging Technologies exhibition at SIGGRAPH 2003 we demonstrated how the table can be part of an interactive environment that recognises user interactions by tracking Smart-its tagged objects such as glasses and water jugs.

The idea is based on a system of Smart Objects, i.e. the table, glasses and jugs, that are autonomously able to detect interaction events related to themselves and to increase their knowledge about their current vicinity by collaboratively sharing related information. In the SIGGRAPH demo the weight table initially only knows that an unknown object has been put on its surface at position (x, y) with weight w. Similarly, the glasses and jugs only know their own identity and are able to detect that they have been put down on an unknown surface. However, both kinds of objects broadcast these events wirelessly. By using time to correlate the events we are able to detect the position of an identified object on an identified surface. The weight information is used to infer the filling state of the glass or jug, respectively.
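A minimal sketch of the time-based correlation (our own simplification; field names and the 0.5 s window are assumptions): the table reports an anonymous "put" at (x, y) with weight w, a tagged object reports that it has been put down, and events whose timestamps fall within the window are merged into "object id is at (x, y) with weight w".

    # Sketch: correlate anonymous table events with identified object events
    # by time. Field names and the 0.5 s window are illustrative assumptions.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class TableEvent:          # "something of weight w was put at (x, y)"
        t: float
        x: float
        y: float
        weight: float

    @dataclass
    class ObjectEvent:         # "object `oid` was put down on some surface"
        t: float
        oid: str

    WINDOW = 0.5               # seconds within which two events count as one interaction

    def correlate(table_events: List[TableEvent],
                  object_events: List[ObjectEvent]):
        """Yield (object id, x, y, weight) for events that match in time."""
        for te in table_events:
            match: Optional[ObjectEvent] = min(
                (oe for oe in object_events if abs(oe.t - te.t) <= WINDOW),
                key=lambda oe: abs(oe.t - te.t),
                default=None,
            )
            if match:
                yield match.oid, te.x, te.y, te.weight

    events = correlate([TableEvent(10.02, 0.3, 0.16, 0.41)],
                       [ObjectEvent(10.15, "glass-1")])
    print(list(events))        # glass-1 is at (0.3, 0.16), weighing 0.41 kg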

In the current implementation the detection of interaction events is implemented completely on the Smart-its. The glasses and the jug use pressure sensors and the Mini Core Boards; the jug also has a temperature sensor attached to monitor the water within. All objects broadcast their interaction events wirelessly to a Display Server (C#) that correlates the events and displays the recognised situation. However, we plan to implement the correlation on the Smart-its themselves, making the system truly independent of any infrastructure. The source is available here.
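The filling-state inference mentioned above amounts to interpolating between a known empty and full weight; a tiny sketch with made-up example weights:

    # Sketch: infer how full a glass is from its measured weight.
    # The empty and full weights below are made-up example values.
    EMPTY_KG, FULL_KG = 0.25, 0.55

    def fill_fraction(measured_kg: float) -> float:
        frac = (measured_kg - EMPTY_KG) / (FULL_KG - EMPTY_KG)
        return max(0.0, min(1.0, frac))   # clamp to [0, 1]

    print(f"{fill_fraction(0.41):.0%} full")   # -> 53% full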

Following the same idea of easily augmenting everyday objects, the table was surrounded by four Smart-its augmented chairs. Pressure sensors were attached under each corner of the seat and connected to the Smart-its. The Display Server highlighted the corners of each seat, providing feedback on the load distribution.
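The corner feedback can be sketched as normalising the four seat readings so that each corner's highlight scales with its share of the load (the readings below are made up):

    # Sketch: normalise the four seat-corner pressure readings so each
    # corner's highlight scales with its share of the load. Values made up.
    def corner_highlights(readings):
        total = sum(readings)
        if total == 0:
            return [0.0] * len(readings)
        return [r / total for r in readings]

    print(corner_highlights([120, 80, 300, 260]))   # someone leaning towards the back corners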

Possible application domains for the object tracking scenario include restaurant settings or office spaces, in which the chairs add to the information from the table and glasses to infer activity and meeting situations.

Publications

  1. C. Kray and M. Strohbach. "Gesture-based interface reconfiguration". Workshop on Artificial Intelligence in Mobile Systems (AIMS) at Ubicomp 2004, Seattle, U.S.A. To appear. [pdf] [Video .mp4 - 5 MB] [Video .avi - 50 MB]
  2. L.E. Holmquist, S. Antifakos, B. Schiele, F. Michahelles, M. Beigl, L. Gaye, H.-W. Gellersen, A. Schmidt and M. Strohbach. "Building Intelligent Environments with Smart-Its". SIGGRAPH 2003, Emerging Technologies exhibition. [pdf] [Video]
  3. A. Schmidt, M. Strohbach, K. Van Laerhoven and H.-W. Gellersen. "Ubiquitous interaction - Using surfaces in everyday environments as pointing devices". In Lecture Notes in Computer Science (LNCS), Vol. 2615, N. Carbonell and C. Stephanidis (Eds.). Springer Verlag, 2002, pp. 263-279. [pdf] [~50 MB Video] [~20 MB Video] [~3 MB Video]
  4. A. Schmidt, M. Strohbach, K. Van Laerhoven, A. Friday and H.-W. Gellersen. "Context Acquisition Based on Load Sensing". In Proceedings of Ubicomp 2002, G. Borriello and L.E. Holmquist (Eds.), Lecture Notes in Computer Science, Vol. 2498, ISBN 3-540-44267-7, Springer Verlag, Gothenburg, Sweden, September 2002, pp. 333-351. [pdf]

Press Coverage

The following is a list of the press coverage we received from our SIGGRAPH exhibition:

Related Projects

People

The following people are or have been involved in the weight table project (alphabetical order):

Contact

Martin Strohbach - strohbach@comp.lancs.ac.uk

 

Images: the Smart-its “Table Augmentation Kit”; attaching the Smart-its to the table; the weight table as a pointing device; table setup at SIGGRAPH 2003; augmented water jug; close-up of a chair with transparent seat.
