Seminar 4: Phidgets begin
Friday, October 26, 2007 at 09:59
martian77 in HCCS Adv

We had our introductory session on the phidgets today. The bits and pieces look quite fun, and actually programming them looks pretty easy. I don't think that's going to be the hard part. I think the tricky bit is going to be finding the right problem and solving it elegantly and in a universal manner.

We got into groups, and I'm working with Lizzie and Charlie again, after Yves was told he can't continue taking the course. We had a brief brainstorming session after the seminar and tried a couple of different approaches to coming up with a problem to solve. First we limited our problem space to the home. Then we started thinking about problems that people with particular disabilities might have around the home.

That wasn't actually terribly fruitful. We found that a lot of the problems we were identifying that way were quite specific to the disability, and people without it wouldn't need the solution. It just didn't seem very universal. So we switched it around: we started thinking of regular, everyday kinds of problems, and how we could solve them for different types of user. And that did it. We hit upon the idea of things you need that are constantly getting lost, like keys, wallet, phone, glasses, etc. (in my case you could add work pass and train ticket to the list). There are already solutions where you can get your keys to emit a noise (no good for deaf users).

We thought rather than get the item to emit a noise or light or anything we could use RFID tags to attach to the item, and have a handheld device that you could carry around that would alert you when you are near a tagged object. I think the idea is sound, but some of the design features are going to need some thinking about to get right. I had a quick blast of some questions that crossed my mind on the way into Brighton later:

  1. How do we alert someone? A screen change (or light) and/or noise.
  2. How do we tell which object we're close to?
  3. Or how close we are? We could use different colours to indicate closeness, or different intensities of colour, or a flashing pattern that speeds up as we get closer. The noise could change pitch, volume or rhythm. How we monitor different objects will affect which solution we choose, I think. If we use a different note for each tag that's close, we probably don't want to change the pitch as we get close to something, because the chord produced might be horrible. Multiple different rhythms of either flashes or noises might get very difficult too, but how easy would different volumes be to detect? Do the RFIDs even support this kind of use, or is it a binary "signal on/off" type of thing?
  4. What if we're close to 2 or more tagged objects?
  5. Does the output need to accurately reflect the object (e.g. a noise like a phone ringing and a picture of a phone when you're close to the phone) or can we keep it simple and just have the tag related to a note and screen section, and let the user learn which is which? Do we actually need to differentiate at all, or can we just tell the user they are close to an object, and let them find out which one and how many?
  6. How do we deal with objects that have been picked up already, and are presumably close to the hand-held all the time?
  7. If we are representing multiple different objects, do we allow the user to select one to focus on? If so, how do we manage that? On the screen it would be easy to select something, but can we be clever with the noise too? (E.g. if we're using a different note for each item, can we let the users slide something up and down to select between high and low pitches? How do we go between searching for a single item and many? Is that over complicating things?)
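Just to make question 3 concrete for myself, here's a rough sketch (plain Python, not Phidgets API code) of what a distance-to-output mapping could look like: a flash pattern that speeds up, and a tone that rises in pitch, as you get closer. The `proximity` value (0.0 = far away, 1.0 = right on top of it) is entirely hypothetical; whether the RFID hardware can actually report anything beyond tag-present/tag-absent is still an open question for us.

```python
def flash_interval_ms(proximity: float, slowest: int = 2000, fastest: int = 100) -> int:
    """Flash slowly when far from the tagged object, rapidly when close.

    `proximity` is a made-up normalised closeness reading (0.0 far, 1.0 touching);
    we don't yet know if the RFID kit can provide anything like it.
    """
    p = min(max(proximity, 0.0), 1.0)  # clamp to the 0..1 range
    return int(slowest - p * (slowest - fastest))


def tone_hz(proximity: float, base: float = 220.0, octaves: float = 2.0) -> float:
    """Raise the pitch as the object gets nearer, up to `octaves` above the base note."""
    p = min(max(proximity, 0.0), 1.0)
    return base * (2.0 ** (octaves * p))


if __name__ == "__main__":
    for p in (0.0, 0.5, 1.0):
        print(p, flash_interval_ms(p), round(tone_hz(p), 1))
```

The pitch mapping here is exactly the case I worried about above: if each tag gets its own note, sliding every note's pitch with distance could produce a nasty chord, so maybe pitch identifies the object and only the flash rate (or volume) encodes closeness.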
So yeah. We've got a few things to think about. How we demo it will also be interesting, but I think we'll end up having to carry a laptop around. I'm going to look up some stuff on torches for the blind that people were talking about a couple of seminars ago, and see how they represent distance. Give it a go, anyway.


Article originally appeared on Life on Mars. See website for complete article licensing information.