
Thursday, September 22, 2016

Droidcon Moscow 2016


Today, on September 22nd, 2016, the third Droidcon conference was held in Moscow. This is a great opportunity for Russian developers to get acquainted with the latest Android trends and technologies, especially taking into account that Google I/O has its residence on the other side of the Earth. ;-)

Of course Android 7.0 was presented as a new target for development, along with many other topics, including usability, testing, and even piracy on the mobile market. Two sections were dedicated to VR and IoT. Although these terms are well established, and some products such as Google Cardboard are very popular, they are seemingly still waiting for greater adoption and maturity. At least, I'm impressed neither by the next-generation VR headset Daydream, nor by the smartness and value of connected "things". The former looks too large for me and lacks a natural human-computer interface (it still relies on plain old extended manual control, obviously called a "controller"), and the latter are no more than toys at the moment (IMHO). I must admit that I have not seen or tested Daydream in real life, so my impressions are based solely on the discrepancies between what I wanted and what I can get according to the available information. Anyway, it's good to know about the work in progress, and I'm eagerly anticipating the next releases.

Unfortunately, this time there was no hardware exhibition as part of the event, so the only things that could be touched and twiddled with were the few items that the speakers brought along.

Saturday, April 12, 2014

Droidcon Moscow 2014: r-evolution of wearables


Today I'm going to post some thoughts on a subject which may seem off-topic here, but only at first glance. Indeed, it's about the Android ecosystem, and thus embraces Android-based applications, including those related to indoor, outdoor, or whatever-else-door positioning.

The main news is that I attended Droidcon Moscow 2014 today. Unsurprisingly, the current trends were covered: wearables, augmented reality, virtual presence, and context awareness.

Among all the evolving products that combine these technologies, one is the most noticeable: glasses.

Google Glass is definitely the most widely known and promoted example. It is surrounded by an enthusiastic buzz and provides a truly new user experience. Yet I have some critical considerations.

First of all, I don't completely agree with Google's design. Their decision was to make the active part of the glasses a single eyepiece. The working area covers only a small part of the visual field, and a user must squint in order to read info. I know this was a deliberate decision not to distract the user from reality. The problem with this approach is that "real reality", augmented reality, and virtual reality are converging very fast, and displaying virtual information over the real world is becoming the real, most important value. As for distracting the user from reality, this is just a matter of switching between display modes (such as a completely transparent one). Anyway, I'd prefer glasses with both oculars active, placed right in front of the eyes.

The second thought about Google Glass concerns another point: the glasses are actually a thin client (UI) for server-based apps. The apps the glasses are capable of running are very lightweight. It's not possible to code anything computationally demanding, because the glasses discharge quickly and become hot (against the skin of your face).

The third point is that Google Glass requires special coding. The apps for the glasses are not plain old Android apps. On the other hand, if I have, for example, an Android application for augmented reality which runs on phones and tablets, I'd like to port it to glasses with minimal or even no effort.

All that said, it's time to introduce another pair of glasses: the Epson Moverio BT-200. This gadget presents full-fledged, so to speak, glasses with two active oculars. Moreover, they are powered by a real Android OS and can run native Android apps. This is the good news. The bad news is that the device is made of two parts: the glasses themselves and a "system block" with a touchpad, connected by a pretty thick wire. My user experience was not so good. It feels really unnatural to move fingers over the touchpad (somewhere in the "background") for cursor control. I'd prefer eye-movement detection to be built into the gadget. It does exist in other products and fits the glasses metaphor very well: the cursor simply follows the sightline.
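
To make the porting point concrete, here is a minimal sketch, assuming a hypothetical existing AR activity from a phone or tablet app. Because the BT-200 runs a regular Android OS, code like this should start on the glasses as-is, with no glasses-specific entry point; the class and view names are invented for illustration and the overlay is just a placeholder.

    import android.app.Activity;
    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.os.Bundle;
    import android.view.View;

    // Hypothetical AR activity taken unchanged from a phone/tablet app.
    // Nothing here is Moverio-specific, which is exactly the point: an
    // Android-powered headset can run it without a special SDK.
    public class ArViewerActivity extends Activity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(new OverlayView(this));
        }

        // Stand-in for the app's real AR overlay: it only draws a label
        // where the camera preview with virtual objects would be rendered.
        private static class OverlayView extends View {
            private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

            OverlayView(Context context) {
                super(context);
                paint.setColor(Color.GREEN);
                paint.setTextSize(48f);
            }

            @Override
            protected void onDraw(Canvas canvas) {
                canvas.drawText("AR overlay placeholder", 50f, 100f, paint);
            }
        }
    }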

OK, let us admit that the outboard "system block" is the only possible solution nowadays to support the full power of an Android device, but the touchpad, as a primary control, is inappropriate. As long as this block has to exist due to current technical limitations, I'd make it a normal Android device with a touch screen, so that it can be used as-is without the glasses. And if the glasses are connected, they use all the power of the device and add a new eye-tracking control experience, including movements, winks, etc. And as you may notice, this sounds like the (Epson) glasses could be implemented as a stand-alone detachable gadget which could be plugged into any Android device! That would be nice. But that is not the whole story.
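
Purely as an illustration of that control model, here is a hypothetical sketch of how such a detachable eye-tracking module could be exposed to apps: a listener receives gaze coordinates and wink events, and the cursor simply mirrors the sightline. The GazeListener interface and its methods do not exist in any real SDK; they are made up for this sketch.

    // Hypothetical API sketch: none of these types ship with Android or the
    // BT-200; they only illustrate "the cursor follows the sightline".
    public final class GazeCursorDemo {

        // Events a detachable eye-tracking module might report.
        interface GazeListener {
            void onGazeMoved(float x, float y); // where the user looks, in screen pixels
            void onWink();                      // a deliberate wink acts as a click
        }

        // Toy cursor that simply mirrors the reported gaze position.
        static final class Cursor implements GazeListener {
            private float x, y;

            @Override
            public void onGazeMoved(float newX, float newY) {
                x = newX;
                y = newY;
                System.out.printf("cursor at (%.0f, %.0f)%n", x, y);
            }

            @Override
            public void onWink() {
                System.out.printf("click at (%.0f, %.0f)%n", x, y);
            }
        }

        public static void main(String[] args) {
            Cursor cursor = new Cursor();
            // On a real device these calls would come from the tracker hardware.
            cursor.onGazeMoved(640f, 360f);
            cursor.onWink();
        }
    }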

Even if eye tracking were incorporated into the glasses, there is something more that could add the next level of control and interaction. When I look at the glasses, a brain-computer interface comes to my mind. For example, Emotiv devices just ask to be embedded into the glasses. Structurally, all we need is to add some contacts on the inner side of the rim. The software part is not a problem at all.

Bottom line: I think the advent of virtual glasses is exciting as a phenomenon, but existing models lack many features which could dramatically improve usability, and, more importantly, all of this is doable right now. So, I'm waiting for the evolution or revolution, as you like. Hope it happens soon.