
iPad, Multi-touch interaction and Subject-centric computing

I am very excited about the iPad. It makes multi-touch interaction mainstream. The iPad revives the idea of “direct object manipulation”, one of the key concepts of Subject-centric computing, and introduces it to a new generation of developers and users.

On the surface, it looks like the iPad (and iPhone) application-centric model (with thousands of various apps) contradicts the application-less model of Subject-centric computing. In reality, applications developed for the iPad and iPhone can often be considered “functions” that can be combined to provide a subject-centric experience.

What is missing? Integration with a global knowledge map that can be used from various apps, and a mechanism for passing subject context that allows applications to be launched (or continued) within a specific subject context.

Apple provides support for embedding geo maps into any application using the MapKit framework. Let’s assume for a minute that Apple creates SubjectKit. This (imaginary) framework would provide access to information collected in global sources such as Freebase and Wikipedia, the same way MapKit provides access to Google Maps. iPad and iPhone applications could then leverage information collected in the global knowledge map. SubjectKit could also allow applications to record the current subject context in shared storage available to all apps. When a user launches an app, the app could read the current subject context and use it to provide a subject-centric experience.
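To make this concrete, here is a minimal sketch (in Swift) of what such an API surface might look like. Everything below (SubjectKit, Subject, SubjectContext and their methods) is a hypothetical name of my own; only the knowledge sources named above are real.

```swift
import Foundation

// A subject is identified by a PSI (Published Subject Identifier),
// e.g. a Wikipedia URL, plus whatever the knowledge map knows about it.
struct Subject {
    let identifier: URL        // e.g. http://en.wikipedia.org/wiki/New_York_City
    let name: String
    let properties: [String: String]
}

// Imaginary facade over global knowledge sources (Freebase, Wikipedia),
// analogous to the way MapKit fronts Google Maps.
enum SubjectKit {
    // Look up a subject in the global knowledge map by its identifier.
    // A real implementation would query the Freebase/Wikipedia APIs here.
    static func lookup(_ identifier: URL,
                       completion: @escaping (Subject?) -> Void) {
        completion(nil) // placeholder: no real backend in this sketch
    }
}

// Imaginary shared, system-managed store for the user's active subjects.
// In reality, apps run in separate processes, so this would have to be
// OS-level shared storage rather than an in-process singleton.
final class SubjectContext {
    static let shared = SubjectContext()
    private var activeSubjects: [URL] = []

    func record(_ identifier: URL) { activeSubjects.append(identifier) }
    func current() -> [URL] { activeSubjects }
}
```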

Let’s take a look at iTunes, for example. It simplifies the “buying experience”, but it is not currently integrated with a global knowledge map. We often need to launch a browser and search to get additional information about the subjects we are interested in (movies, actors, directors, tracks, groups, …). iTunes has some reference data, but it is quite minimal in comparison with what we can get from a global knowledge map, and it is limited to the iTunes app.

With the (imaginary) SubjectKit, iTunes (and any other app) on the iPad could leverage information available in the global knowledge map directly, without a manual search.
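For instance, instead of sending the user to a browser, iTunes could resolve additional facts in place, using the hypothetical lookup sketched above (the property name here is invented for illustration):

```swift
// Hypothetical: enrich an iTunes page for an actor with facts from
// the global knowledge map instead of a manual browser search.
let actor = URL(string: "http://en.wikipedia.org/wiki/Tom_Hanks")!
SubjectKit.lookup(actor) { subject in
    if let subject = subject {
        print("Born: \(subject.properties["birth_date"] ?? "unknown")")
    }
}
```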

What about sharing subject context between applications? Let’s say I open my Weather app and check the weather in New York City. The Weather app can record that one of my currently active subjects has the identifier http://en.wikipedia.org/wiki/New_York_City. Let’s assume that, as a next step, I launch iTunes. iTunes (in my imaginary scenario) can retrieve this subject identifier from the shared context storage. If I tap the “Movies” tab, iTunes can suggest, for example, the movie “Sleepless in Seattle” in a new “Related to your active subjects” section. iTunes can do this because (in my imaginary scenario) it can leverage the current subject context, the internal iTunes database, and the global knowledge map.
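Continuing the sketch (all names are still hypothetical, and the shared store would really need OS-level storage rather than an in-process singleton), the flow between the two apps might look like this:

```swift
// In the Weather app: record the city the user just looked at.
let nyc = URL(string: "http://en.wikipedia.org/wiki/New_York_City")!
SubjectContext.shared.record(nyc)

// Later, in iTunes: read the active subjects and look each one up in
// the global knowledge map to find related catalog items.
for identifier in SubjectContext.shared.current() {
    SubjectKit.lookup(identifier) { subject in
        guard let subject = subject else { return }
        // Match subject.name ("New York City") against movie metadata
        // to populate a "Related to your active subjects" section.
        print("Suggest items related to \(subject.name)")
    }
}
```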

Sharing subject context between apps raises the issue of protecting user privacy. The SubjectKit framework can maintain a “white list” of apps that are allowed to save into, and restore from, the current subject context. SubjectKit can block applications that try to read the subject context if they are not allowed to see it, and it can prevent applications from saving active subjects into the shared subject context if they are not allowed to do so.
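Such a white list could be enforced by gating every read and write on per-app permissions. A minimal sketch, with all type names and bundle identifiers invented:

```swift
import Foundation

// Hypothetical per-app permissions for the shared subject context.
struct ContextPermissions {
    let canRead: Bool
    let canWrite: Bool
}

// Gatekeeper in the imaginary framework: every read/write of the shared
// context is checked against a white list keyed by app bundle identifier.
final class SubjectContextGatekeeper {
    private let whitelist: [String: ContextPermissions] = [
        "com.example.weather": ContextPermissions(canRead: true, canWrite: true),
        "com.example.itunes":  ContextPermissions(canRead: true, canWrite: false)
    ]
    private var activeSubjects: [URL] = []

    func record(_ identifier: URL, from bundleID: String) -> Bool {
        guard whitelist[bundleID]?.canWrite == true else { return false } // blocked
        activeSubjects.append(identifier)
        return true
    }

    func current(for bundleID: String) -> [URL] {
        guard whitelist[bundleID]?.canRead == true else { return [] } // blocked
        return activeSubjects
    }
}
```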

Of course, this approach is not limited to OS X and the iPad. It is just that the combination of the iPad design, the powerful multi-touch interface, and the strength of OS X creates a winning platform for Subject-centric computing.

iPhone OS 3.0 – ready for Subject-centric computing

Apple just introduced iPhone OS 3.0 (beta) and the 3.0 SDK. There are lots of improvements and new features. The iPhone is a great platform for developing mobile applications, and OS 3.0 makes it even more compelling for building Subject-centric solutions. One of my favorite features is the Push Notification Service.

We introduced Subject-centric RSS feeds on the Ontopedia PSI server some time ago. With these feeds in place, we can subscribe to and monitor information about subjects we are interested in using RSS aggregators (including mobile ones). As an Ontopedia user, I can submit an assertion, for example, that I am thinking about the Blogging Vocabulary. Everyone with an RSS subscription to Blogging Vocabulary, or to my PSI, will be notified about this new assertion.

But, of course, existing RSS aggregators and the pull model do not allow us to realize the full potential of Subject-centric micro-blogging. Services like the iPhone Push Notification Service are game changers. I wrote a blog post about Subject-centric real-time messaging many years ago; now is the right time to implement it, and with the new Apple iPhone SDK it should be fun.
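As a sketch of what the push side might carry: when a new assertion about a subject arrives, a server could send a notification whose payload includes the subject’s PSI. The aps/alert/sound keys follow Apple’s documented push payload format; the custom subject key and the PSI URL are my own illustrative convention.

```swift
import Foundation

// Server-side sketch: build a push payload announcing a new assertion.
// "aps"/"alert"/"sound" follow Apple's documented payload format; the
// custom "subject" key (carrying the subject's PSI) is our own convention.
let payload: [String: Any] = [
    "aps": [
        "alert": "New assertion about Blogging Vocabulary",
        "sound": "default"
    ],
    "subject": "http://psi.ontopedia.net/Blogging_Vocabulary"
]

if let data = try? JSONSerialization.data(withJSONObject: payload,
                                          options: [.prettyPrinted]),
   let json = String(data: data, encoding: .utf8) {
    print(json) // this JSON would be sent through the Push Notification Service
}
```

A receiving app could read the subject key and jump straight to that subject, instead of waiting for the next feed poll.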

Multi-touch interaction, iPhone and Subject-centric computing

If you follow news related to HCI (human-computer interaction), then you have probably seen the multi-touch interaction demonstrations by Jeff Han. You have probably also used (or played with) an iPhone or iPod touch, so you know what multi-touch interaction is about. This kind of interface goes hand in hand with Subject-centric computing. Why?

Multi-touch interaction promotes direct manipulation of various kinds of objects. The iPhone follows a more traditional application-centric paradigm (with smooth integration of different applications). Jeff Han, on the other hand, demonstrated an almost application-less interface. Not only “documents” but also the “things” we are interested in can be surfaced through a multi-touch interface. People, places, events, driving routes, and songs can be represented as “subjects” in a multi-touch interface, and we can interact with them easily and naturally. That is the way we would like to interact in a subject-centric computing environment.

A multi-touch interface translates gesture-based interactions into operations on subjects (the “things” we are interested in). Subject-centric infrastructure can help implement the “glue” that identifies and interconnects subjects “hosted” by various applications and services on desktops, intranets, and the Internet.
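In UIKit terms, that mapping might look like the sketch below. UITapGestureRecognizer is a real API; the SubjectView protocol and its operations are hypothetical, in the spirit of the SubjectKit idea above.

```swift
import UIKit

// Hypothetical operations a subject-centric UI might support.
protocol SubjectView {
    var subjectIdentifier: URL { get }
    func showDetails()          // e.g. pull facts from the knowledge map
    func relate(to other: URL)  // e.g. assert an association between subjects
}

final class SubjectViewController: UIViewController {
    var subject: SubjectView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Map a concrete gesture to a subject-level operation:
        // tapping a subject's on-screen representation opens its details.
        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(didTapSubject))
        view.addGestureRecognizer(tap)
    }

    @objc private func didTapSubject() {
        subject?.showDetails()
    }
}
```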