Natural (Language) Interfaces

This blog post is not about Siri, sorry.

I remember when the best way to control a games console was like this:

But over the years, controllers started to look like this:

From a single red button to a plethora of buttons, triggers, D-pads, joysticks, joysticks which act as buttons and switches, it’s no wonder that there was a bit of a “revolution” when this hit the market:

But everyone has been a little fascinated with this for the last couple of years. And not surprisingly – this is one of the interfaces that we use to control the world. It seems natural to use it for direct manipulation.

And despite the fact that the hardware is obviously capable of it, games designers haven’t been making use of one of the other obvious interfaces. One that we humans excel at.

This isn’t the same as using a headset to bark commands at team-members, but using defined commands to instruct a game element. Yes, these games exist (Shouter being one of the most well-known) but the sophistication is low.

What I’m looking for is the difference between Newton and Palm, but in terms of voice. Newton tried to recognise your handwriting while Palm made you learn a certain alphabet. For games, at this stage, we need to create a basic control set that can be easily recognised by a language processor. Whether that is in understanding actual words or whether it is mapping wave patterns – it doesn’t matter. The point is to use our voice to control games.

The instructions can be short, they can be words, they can be screams and cries. When I call “Retreat”, my units should start to retreat back to base, making a tactical withdrawal. When I order “Advance”, they should use cover and opportunity to advance upon the enemy position. And when I shout “Charge”, you get the idea.
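The "Palm, not Newton" approach above can be sketched as a tiny fixed vocabulary mapped onto unit behaviours, rather than free-form speech recognition. A minimal sketch — the command words come from this post, but the behaviour names are illustrative assumptions, not from any real game:

```python
# A small, fixed command set: constrain the vocabulary until
# recognition is trivially reliable, then grow it (Graffiti-style).
UNIT_ORDERS = {
    "retreat": "withdraw_to_base",
    "advance": "advance_using_cover",
    "charge":  "assault_enemy_position",
}

def interpret(utterance: str) -> str:
    """Map a recognised word onto a unit behaviour; unknown words mean hold."""
    # Keep only letters so "Charge!" and "CHARGE" match the same command.
    word = "".join(c for c in utterance.lower() if c.isalpha())
    return UNIT_ORDERS.get(word, "hold_position")
```

Whether the recogniser works on actual words or on raw wave patterns, the game layer only ever sees one of a handful of known commands.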

(images not used with permission)

Where are the apps we’ve been waiting for?

It has been nearly a year since I first came in close contact with the original iPad. It blew my mind, and since then, it has become a daily accompaniment. I create content on my MacBook Air, but I spend a lot of time consuming content and media on the device. In fact, if I had to guess, I use my iPad as much as I use my notebook computer.

However, if iPad, the device, is more magical, the applications (apps) for the device are anything but. For nearly a year, I’ve been waiting (and waiting) for experiences befitting the device and its hardware capabilities. – Om Malik, GigaOM

I concur. I’m still waiting for the amazing experiences that we think we deserve when we tote around such amazing hardware. And if that goes for iPad, then it goes double (or maybe tenfold) for Android, WebOS and anything else out there.

We do have time, however. The mouse went from humble beginnings in academic and commercial research in the 60s to initial release with the Macintosh in 1984 and it’s still probably the major input metaphor for computers in existence. We have been poking at our computer screens with a single fingertip, the mouse cursor, for over two decades. While we all like the look of the future with multi-touch (and from the Kinect, zero-touch) interfaces, we still await the apps which will fulfil this promise to us. New touch-based methods to consume old style media ain’t a big deal.

We want mobile-optimised hyper-local-aware software, designed for touch and equipped with contextual understanding and social-network awareness so we can get the most personalised experience.

And this is the low bar.

If we can think of apps that fit these criteria in minutes, imagine what we could do if we were in that business – if our job was not only to talk about the next big thing but to be part of the team creating it.

from Virtual Reality to Augmented Reality

Published in 1993, Cybergeneration was a radical departure from R.Talsorian Games’ previous successful Cyberpunk line. Whereas the latter focussed on style over substance, high calibre firearms and heavily armoured Solos backed up by geeky NetRunners, CyberGeneration focussed on the kids of these embittered mercenaries and endowed them with nanotech-derived superpowers.

In Cybergeneration, the Net became actualised from Virtual Reality to Augmented Reality. Individuals would see augmented reality objects as readily as they would see real world objects – 3D objects in realspace. At the time I knew that IP addresses were ‘geotagged’ but this was long before we realised that GPS units could be embedded into superslim phones that were always net-connected.

This week we also saw Metaplace, one of the many 3D Virtual Worlds with User Generated Content, fall by the wayside. “It raised $9.4 million over two rounds of funding with that goal in mind, managing to get the buy-in from new investors Marc Andreessen and Ben Horowitz last October”, yet they have just announced their closure on January 1st, 2010. I think this is typical: Virtual Worlds require you to sit in front of a computer and limit your interaction to keyboard and mouse pointer. In a 3D world, the mouse pointer becomes a single fingertip by which you interact with the world. For Augmented Reality, we have to avoid the mobile phone screen becoming a keyhole through which we view the world. We have to be able to touch it and to hear it.

Earlier this week, Edo Segal wrote a guest post on Techcrunch describing a cyberpunk story he wrote 16 years ago which involved augmented reality, and I’d hesitate to link this with Cybergeneration (despite the identical publishing year).

Edo reckons the building blocks of an augmented reality system have to be more than we currently have, which amounts to little more than search. He sees the four main blocks as being:

  1. Realtime Web (Twitter, news flows, world events, and other information which relates to changes in the world)
  2. Published Information (sites, blogs, Wikipedia, etc.)
  3. Geolocation Data (your location and information layers related to it, including your past locations and that of your friends, as well as geo-tagged media)
  4. Social Communications (social graph updates, IMs, emails, text messages, and other forms of signal from your friends).

and he handily provides a diagram.
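Those four blocks can be sketched as tagged, time-ordered streams merged into a single ambient feed. The field names and source labels below are my own assumptions, not Segal’s:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Item:
    timestamp: float
    # Only the timestamp participates in ordering; the rest is payload.
    source: str = field(compare=False, default="")   # "realtime" | "published" | "geo" | "social"
    payload: str = field(compare=False, default="")

def ambient_feed(*streams):
    """Merge several already time-sorted streams into one ambient feed."""
    return list(heapq.merge(*streams))
```

The point of the sketch is that none of the four blocks is special at presentation time – a tweet, a geo-tagged photo and a friend’s IM all become items on one timeline, and the interesting work is in deciding which of them to surface.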

but he says something in the Techcrunch post which resonates:

One only needs look at a teenager today as they do their homework, watch TV, play a game, and chat while watching their Facebook stream to get a sense for humanity’s expanding affinity to consume ambient streams. Their young minds are constantly tuning and adapting to an age of hypertasking.

and I reckon that this is being unfair to some of us oldies. In March of this year, I met with Ewan McIntosh, one of the 4IP commissioners and part of the round-table chat included the admission that he watches TV with a laptop on his lap and a mobile phone on the arm of the chair beside him. This is how my household watches TV. The concept of not being connected while consuming information is alien to me. I want to look around the periphery of it, I want to dig deeper and, at the moment, technology is failing me.

I’ve said this before and I’ll say it again: Augmented Reality at the moment is a sham. It’s all search and toys. Either you’re pulling geotagged information from one of the search engines or content silos (and I include Wikipedia here) or you’re using geotags and fiducial markers to drop toys here and there. It’s all smoke and mirrors. It’s not enough to be just tagging stuff and re-presenting it as your own. There has to be something novel coming out of it – even if it’s just the presentation of context.

The CyberGeneration example is a good one. Their solution, Virtuality, included the presentation of interactive tools – whether those be musical instruments or even keyboards and computers. We’re touching the edges of a world where hardware itself is virtualised and made into software. We’ve already seen this done in something as simple as the ‘Compass’ app on the iPhone – it’s a virtualised hardware solution projected onto a multi-purpose handheld display. At some point we’ll figure out how to internalise that display, and I reckon that MIT’s Sixth Sense technology is probably one way of doing it (though I guess that having a projector on your chest is a limitation of the technology we’re currently using rather than a potential prototype).

In early 2010, I’d like to invite some of the other players in AR-related technology in Northern Ireland, like Awakin, Filmtrip, the Design Zoo, ReDisc0very, Ulster MediaScapes and others to have a “DevDays” type event where we talk about the very possibilities of augmented reality solutions. Some of them read this blog, some don’t – but I’d like to hear of other folk in the province (and beyond) who are interested in talking and/or presenting in a BarCamp-esque situation.

Accessibility as paradigm

Posts like this remind me why I never settled for something “in the meantime” for my co-working plans.

Martin Pilkington (of MCubedSoftware) wrote about The Accessible Mac:

I’ve not got a disability that limits my ability to use the computer and don’t personally know anyone with one, but I do have a strong sense that we should all be treated equal and have equal opportunities. When we can do something to help someone else and it is very cheap or very easy to do, then we should do it. Making accessible applications can be very easy and not take too much time, but can make a world of difference to some people.

Working with my Dad on the Mac highlighted some of the difficulties to me. My Dad is on the Blind Persons Register as well as having severe motor function loss due to nerve damage. Despite this he soldiers on, maintaining a Mac mini hooked up to a commodity LCD TV. I use Back to My Mac every now and then to guide him through some steps or fix an actual technical issue (iTunes is a PITA for blind folk) but he manages quite well with a combination of VoiceOver and holding down Control while scrolling (which, on the Mac, zooms the interface – a useful tip for those of us who aren’t nearsighted too).

We don’t have great haptics on computers – some engineers build a vibrating component into a tool and we call it haptic feedback. No. It’s. Not. A vibration without some sort of context is useless. We need to be thinking bigger than that – I’m faintly disappointed to learn that the new iPod shuffle gets all of its text-to-speech conversions from the hosting Mac or PC, and that it’s not just a widget inside the tiny device. That said, Apple had text-to-speech working on the Newton over a decade ago (and I’ll save my rant about how cool the Routing menu was for another day). We shouldn’t rely on visuals and a vibration in a device to provide our interfaces when we have computers which are capable of much more.

The point being – my Dad isn’t even an extreme case. We have so many examples of digital ‘illiteracy’ due to poverty, ability, experience or fear, and yet every day we hear about new services and new applications. There are times when I feel a little threadbare, stretched across a frame from having so many inputs and outlets (and no, I’m not talking about Interface Builder and Xcode here). That feeling hits me every couple of months; I can’t imagine what it must be like to be confronted with this every single day without all five senses and a brain that’s been pretty much wired into the Internet for nearly two decades.

Great design creates new data.

Scott Stevenson writes:

Visual design is often the polar opposite of engineering: trading hard edges for subjective decisions based on gut feelings and personal experiences. It’s messy, unpredictable, and notoriously hard to measure. The apparently erratic behavior of artists drives engineers bananas. Their decisions seem arbitrary and risk everything with no guaranteed benefit.

An experienced designer knows that humans do not operate solely on reason and logic. They’re heavily influenced by emotions and perceptions. Even more frustratingly, they often lie to you about their reactions because they don’t want to be seen as imperfect.

and in the comments are some more excellent soundbites

Rob Morris writes:

…exceptional design has ideals, integrity and vision. It listens and is informed by its users, but sometimes more importantly, it knows better.

Doug writes:

Great design creates new data. Design is creative, not reactive

Two weeks ago I met Jonathan Ive. Ive is SVP of Industrial Design at Apple. He’s credited with some of Apple’s design triumphs: the eMate, the iBook, the iMac, PowerBook G4, iPod, iPhone, Mac mini and a raft of others. He said his team is small but they’ve been working together for a very long time now – something that affords great understanding between them. Ive seems a quiet and humble bloke, but his presence and passion were able to shine through in the brief meeting – his volume increasing as he became more passionate about the subject. This bloke, from the same part of the country as David Beckham, was voted by the Daily Telegraph as being more influential than Beckham (which probably says more about how out of touch famous footballers are with the rank and file).

I love how some of the designs I like inspire strong feelings in myself and others. Exceptional design should inspire polarity of thought – you should be in love with it or hate it – it should, by its very name, be an exception. This is subtly different from ‘the most usable design’ of course, which should slot into your own user model so easily that you barely notice it. Great design in interfaces can also polarise, but even the worst reaction should acknowledge the attention to detail in the user model. This is something that, again, Apple does well. It’s always been a medium where Apple has changed things incrementally, and when they have perhaps taken a step backwards (like Mac OS X Public Beta) it was most definitely a ‘girding of our loins’, a ‘hitching of our skirts’ so we could better witness and experience the changes going forward.

The Sixth Sense?

The theory goes that our technology can enable us to have a sixth sense. With the computer I keep in my pocket, I can find directions, addresses of local businesses and amenities, restaurant reviews and more.

Pattie Maes of MIT demoed this at TED, from the Fluid Interfaces Group:

The flashy bit is actually the least important part of the technology – it’s not about the projection mechanism – it’s more about the interpretation algorithm. Software that understands context.

While I have seen ‘wearable’ interfaces before, they tend to look a little dorky – but then we all thought Lieutenant Uhura was odd with her earpiece and now every second car driver wears a Bluetooth earpiece. Why the earpiece is not acceptable as pedestrian wear is beyond me. It would seem to be obvious.

And with the recent innovation of adding VoiceOver to one of the smallest music players on the market – the player reads the text of MP3 tags to you – it won’t be long before we can expect something similar in the iPhone and other smartphones. Receive a text – yeah, just read it to me. Email, yup. Tweets, sure. Tell me that your battery is low. Give me email filters to read out any message marked urgent or those from certain people.
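Those read-aloud rules amount to a single filter: given a notification, decide whether it deserves a voice. A minimal sketch – the notification kinds and the VIP list are illustrative assumptions, not any real phone API:

```python
# Senders whose email should always be spoken aloud (illustrative).
VIP_SENDERS = {"boss", "mum"}

def should_speak(kind: str, sender: str = "", urgent: bool = False) -> bool:
    """True if the notification should be read out rather than just displayed."""
    if kind == "battery_low":
        return True                          # always announce low battery
    if kind == "email":
        return urgent or sender.lower() in VIP_SENDERS
    return kind in {"sms", "tweet"}          # texts and tweets: just read them
```

The interesting part isn’t the speech synthesis – it’s exactly this kind of rule, deciding what earns a claim on your ear.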

I don’t need a projector around my neck – I want something which will, for example, use Bluetooth to identify folk I meet (gee, need some handy peer-to-peer tech there), immediately fetch me their social data from LinkedIn, Facebook, Twitter or whatever is popular, and then feed that information into my earhole so I know what to say. Yes, there are technical difficulties here (though a lot of interface choices could be handled with a chording approach or using the thumb to select from four finger choices).

I think the problem is that I’ve ranted about this before. I want a Ghost in the machine which will handle some of the mundanes for me. I want it to feed me with more info, on demand. And what’s best – this is all UI. It’s about putting a more human interface onto existing technology. At the moment we only really use our eyes for data and we respond to much more than just visuals. When we watch a movie, it doesn’t matter if the screen is huge or fits in our pocket as long as the sound is good. We need to be using the aural sense a lot more and not just for indignant beeps.

It’s not the projector round your neck or the attached camera. That’s information and interface without privacy and exclusivity. I’ll happily carry round my iPhone if this was available – I already carry it everywhere.

Heck, I’d even settle for something that could just tell me yes or no.

MoMoBelfast and the Apps Show and Tell
Graham weighs in on Windows Mobile 6.5

Throughout the talk, the words innovation, interaction and user experience are repeated, however just saying the words does not make it true. I found no presence of innovation in Windows Mobile 6.5, it definitely seems like they’ve tried to bolt on touch capability to their existing OS. I found more innovation in Apple’s iPhone cut and paste feature than in the whole of this Windows Mobile demo.

I hate to say this but Apple has the industry in catch-up mode again. It’s easy to get labelled as a fanboy, but every second headline seems to be about how the iPhone is brilliant or how some new phone/platform will kill it. Either way it has huge amounts of mindshare.

This evening we had a bit of a treat and were able to attend Mobile Monday Belfast’s Show and Tell for mobile apps. The auditorium had a good number of folk in there and the demos presented all had something unique to offer. I had to speak for five minutes at the start about the, until recently top secret, iPhone initiative and then we got into the demos proper.

EyeSpyFX – Anthony Hutton was on stage demonstrating his webcam viewer app, which is available for 15 varieties of mobile phone in addition to the iPhone. His demo, slowed only by the really poor reception in the building, was impressive. Anthony’s most memorable statements were about the economics of developing apps for mobile phones, and for the iPhone in particular. He claimed that developing for the iPhone was a fussy affair – requiring a Mac, the developer licence and an iPhone to test on – quite a significant outlay for a startup with no prior Mac experience. But he said that development was quick and easy, and that getting the app onto the store and the support documentation were excellent. He also commented that his apps sell on standard JavaME platforms, where the operators and aggregators normally charge €6 for the app; of this he would get maybe €1 and, with some aggregators, perhaps just €0.30 per copy. On the iPhone, Apple takes 30% of the revenue, but as his app costs £2.99, he pockets over £2.00 per copy. And, in his own words, despite there being fewer iPhones out there – and buying apps for JavaME phones being a pain – his iPhone sales numbers have been four times the sales of his JavaME apps.
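Anthony’s numbers are worth working through. On the App Store the developer keeps 70% of the price; the €6 aggregator sale returning €0.30–€1.00 implies the middlemen keep five-sixths or more. A quick sketch of the arithmetic:

```python
def developer_take(price: float, store_share: float) -> float:
    """What the developer keeps after the store/aggregator takes its share."""
    return round(price * (1 - store_share), 2)

print(developer_take(2.99, 0.30))   # App Store at £2.99: "over £2.00 per copy"
print(developer_take(6.00, 5 / 6))  # a €6 aggregator sale returning €1
```

So even before the sales-volume difference he cited, each iPhone copy was worth roughly double the best aggregator payout on a price half the size.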

Anthony Hutton, EyeSpyFX

The next demo was John Martin from Total Mobile – a Windows Mobile developer squarely ensconced in Windows land and with strong sales in case management (by all accounts they’re a Consilium spin-out?). Their user interface was very Windows Mobile, and people used to that would feel very at home. Speaking afterwards, I found John to be very personable and enjoyed his opinions of his various mobile devices (which included an HTC Advantage and a Redfly ‘unit’).

Next up was Ryan Cushnahan with his GAAStats Windows Mobile app. While his user interface was very basic, the use-case for the software was very strong. He licenses the software for £400, which seems steep compared to AppStore pricing, but it’s a niche product by someone who knows his game. I think Ryan might be a good candidate for a UI makeover!

I then went on stage and did a quick demo of three iPhone apps from ‘local’ developers. The first was Pocket Universe from Craic Design – one of the best astronomy apps for the iPhone. John’s pedigree includes doing similar apps for Windows Mobile. I also gave a minute to his other iPhone release, ShootEmUp, and just tonight I found out about his free Animal Track kids game, devised by his nine-year-old daughter.
Next, I demo’ed Locle mini from Dublin based social networking startup, Locle. Locle is a simple app currently utilising a web view for most of their user interface but a little birdie tells me that their sales have meant they’re able to get a more native version on iPhone.
The third was close to my heart, EyeCandy Comics from Blue Pilot Software. I also made passing reference to a new service called Infurious Republic when I was asked when the rank and file would be able to get their stuff online.

Lastly, and in the door by the skin of his teeth, came Rory from Ammeon. Again the poor reception and lack of WiFi killed some of the demo, but there was enough to get the gist of it. Commune effectively allows an operator to create a custom TV station with their own content, which can be viewed over a mobile link and has a small degree of social networking in a comments system attached to each video.

In the conversations after, I was explaining that my interest in the iPhone Initiative was to find digital content companies interested in developing skills in mobile interface design – that skills developed for iPhone, darling of the media, would easily port to Windows Mobile, Palm or Blackberry when the time came. Someone then commented, regarding the Windows Mobile offerings, that the market share for Windows Mobile far exceeds that of the iPhone. The commenter was a dyed-in-the-wool Windows guy (I first met him over a decade ago when he was working for a DELL reseller and was trying to tell me IIS was better than Apache or Netscape SuiteSpot). I hear you – but so many of those devices are dumb terminals, used as barcode scanners and handheld credit card scanners – it’s a different market and they’re never going to run interesting software. It was an odd statement – it really rang as defensive – and seemed particularly odd considering Anthony Hutton’s comment that his sales of iPhone apps far exceeded his sales of apps for the other platforms he supports: JavaME, Blackberry and Windows Mobile. In essence, while there may be more of them out there, they ain’t buying apps.

All in all, the night was a resounding success for Norbert and Colin, both of whom put a lot of work into Mobile Monday in Belfast. Next month they’ve got someone from Mozilla Mobile coming in and a whole new raft of interesting stuff to learn about.

Usability Rant #UXFAIL #ROFL #ZOMG

When User Experience experts have a “UXFAIL” event on their website, it gives me a little smile.

Reading Christian Lindholm’s blog about Mobile User Experiences, I felt urged to comment. Quick tap, hit Post and …

So, what’s wrong with my data?

Is it because I’m Irish?

Dropship showing what good UI is…

This is a video of ngmoco’s DropShip game for iPhone.

I’m fascinated by the controls – described as ‘touch-anywhere, dual-analog’. The interface for the controls pops up where you place your fingers and the control (direction) of the thrust or weapons fire is directed by a short drag of your finger. It’s simple, it’s effective and it looks fucking amazing.
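The core of ‘touch-anywhere, dual-analog’ can be sketched in a few lines: wherever a finger first lands becomes that stick’s origin, and a short drag from it yields a direction. A minimal sketch – the dead-zone value is an assumption, not anything from ngmoco:

```python
import math

def stick_vector(origin, current, dead_zone=5.0):
    """Unit direction of a drag, measured from where the finger first landed."""
    dx, dy = current[0] - origin[0], current[1] - origin[1]
    dist = math.hypot(dx, dy)
    if dist < dead_zone:
        return (0.0, 0.0)               # too small a drag: no input
    return (dx / dist, dy / dist)       # unit vector for thrust or fire
```

Run one of these per finger and you have the two virtual sticks – no fixed on-screen D-pad required, which is exactly why the scheme feels so natural.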

Well done, ngmoco.