Viewlity V2.0 - Diploma Project




1. Overview

Viewlity is an Augmented Reality engine for showing nearby points of interest on your Android phone. It's a powerful, easy-to-use tool for discovering places around you: the nearest fuel stations, coffee shops, restaurants, ATMs, subway stations, places of worship and many other points of interest. You can also locate them on Google Maps for a broader view of the area.

The experience of Viewlity is a different story for each user. By just looking through the camera and moving your Android phone left or right, Viewlity enriches the world that surrounds you.


LogSync

Project Overview

 

Over the past few years, the Android platform has become one of the fastest-growing platforms for mobile devices and tablets. Major manufacturers such as Samsung, Motorola, Sony Ericsson, LG and HTC have focused their attention on Google's Android operating system, making it a very serious competitor for iOS (Apple) and Windows Phone (Microsoft).

Android LogSync is a client-server application that synchronizes call and message logs with a server in order to instantly alert users when they don't carry their mobile phones with them. If a user forgets his phone somewhere, or leaves it behind on purpose, he can access his missed calls and messages through a web interface or through a widget installed on another Android device, which connects automatically and receives the data from the server. The LogSync application consists of three parts. The first is a module that is alerted by the Notification Manager when a call or message is received; it then establishes a secure connection and transmits the new data to the server. The server, which is the second module, has a MySQL database where it stores all the information regarding each user. The last module is an App Widget that runs on another mobile device's home screen and keeps the user informed with the latest logs. Further developments involve remotely shutting down or blocking the device if it is stolen or lost, remote call forwarding, and porting the application to other mobile platforms (Windows Phone 7, iOS 5).

Functionality

System Architecture

(system architecture diagram)

  • The client sends new data to the server
  • The widget receives data from the server and displays it to the user
  • The server establishes connections with clients/widgets, stores the data, and hosts a web interface
    • The MySQL database can be accessed through PHP methods (from the web interface) or through a JDBC driver that allows Java methods to query it.
    • For the web interface I used a WAMP server, which allowed me to combine an Apache server with PHP scripts and MySQL.
    • Communication between the server and clients/widgets is encrypted using the SSL protocol.
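The client's transmission step can be sketched as follows: a new log entry is serialized into a line-based record before being written to the SSL socket's output stream. The `LogEntry` class and the pipe-separated record format below are illustrative assumptions, not the project's actual protocol:

```java
// Sketch: serializing a call/message log entry for transmission over the
// encrypted channel. Field names and the record format are assumptions.
public class LogEntry {
    final String type;      // "CALL" or "SMS"
    final String number;    // caller/sender phone number
    final long timestamp;   // event time in epoch milliseconds

    LogEntry(String type, String number, long timestamp) {
        this.type = type;
        this.number = number;
        this.timestamp = timestamp;
    }

    /** Encode as a single line, ready to be written to the socket stream. */
    String toRecord() {
        return type + "|" + number + "|" + timestamp;
    }

    public static void main(String[] args) {
        LogEntry missed = new LogEntry("CALL", "+40721000000", 1318000000000L);
        System.out.println(missed.toRecord());
    }
}
```

On the server side, the corresponding Java handler class would parse such records and insert them into the MySQL database via JDBC.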

 

Application Screenshots

(client and widget screenshots)

Results

  • Demo: http://www.youtube.com/watch?v=p8PuzWO6YF0
  • Functional client and widget applications that poll the server to send/receive data at a user-defined refresh interval
  • Functional server with a web interface and two Java classes that handle clients/widgets

Further Development

  • C2DM framework for push notifications
  • Google App Engine for hosting the server
  • Porting to other platforms

Plant Recognition

1. Introduction
The basic purpose of the application is to help the user take care of plants by sending notifications and offering plant care tips. The problem is that the application cannot give advice without knowing what kind of plant it is dealing with, and since the target users are precisely those who lack the knowledge (or the inclination) to find out, the app identifies the plant for them through its leaf recognition module.

2. Structure
From the image of a leaf captured with the camera we extract a series of 12 features after passing the image through 9 transformations. The extracted features are not processed on the mobile phone but sent to a server built on Google's App Engine, where they are passed through a Probabilistic Neural Network that classifies the image.


In order to reduce the amount of data that the Neural Network needs to retain, and to improve the time spent identifying the closest match, Principal Component Analysis (PCA) is used. This statistical technique finds patterns in the data and identifies the most significant directions of a dataset, enabling the developer to strip away redundant information. After applying this technique we can reduce the number of features obtained from image processing from 12 to 5.
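The projection step itself is just a matrix product: each 12-dimensional feature vector is multiplied by the matrix of the top 5 principal components. A minimal sketch, using a toy vector and a placeholder component matrix rather than the one trained from the leaf dataset:

```java
// Sketch: projecting a feature vector onto principal components.
// In the real application the component matrix comes from PCA on the
// training data; here it is a toy placeholder for illustration.
public class PcaProject {
    /** Project an n-dim vector onto k principal components (k x n matrix). */
    static double[] project(double[] features, double[][] components) {
        double[] reduced = new double[components.length];
        for (int k = 0; k < components.length; k++) {
            double sum = 0;
            for (int i = 0; i < features.length; i++) {
                sum += components[k][i] * features[i];
            }
            reduced[k] = sum;
        }
        return reduced;
    }

    public static void main(String[] args) {
        double[] features = {1, 2, 3};               // toy 3-dim vector
        double[][] pcs = {{1, 0, 0}, {0, 1, 0}};     // keep the first 2 axes
        double[] r = project(features, pcs);
        System.out.println(r[0] + " " + r[1]);       // prints "1.0 2.0"
    }
}
```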

Feature classification is done using a Probabilistic Neural Network (PNN), a popular choice in pattern recognition. Neural networks are known for their speed and sound statistical basis, while the k-Nearest Neighbor algorithm is easy to initialize and has high accuracy. By combining the two we obtain the PNN, which draws on the strengths of both approaches.
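A PNN in this setting sums a Gaussian kernel over the stored training vectors of each class and assigns the query to the class with the largest summed response. A compact sketch, where the smoothing parameter sigma and the toy training data are assumptions:

```java
// Sketch: Probabilistic Neural Network classification. Each class keeps
// its training feature vectors; a query is assigned to the class whose
// summed Gaussian kernel response is largest.
public class Pnn {
    static int classify(double[] x, double[][][] classes, double sigma) {
        int best = -1;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (int c = 0; c < classes.length; c++) {
            double score = 0;
            for (double[] pattern : classes[c]) {
                double d2 = 0;                      // squared distance
                for (int i = 0; i < x.length; i++) {
                    double diff = x[i] - pattern[i];
                    d2 += diff * diff;
                }
                score += Math.exp(-d2 / (2 * sigma * sigma));
            }
            if (score > bestScore) { bestScore = score; best = c; }
        }
        return best;
    }

    public static void main(String[] args) {
        double[][][] classes = {
            {{0.0, 0.0}, {0.1, 0.1}},   // class 0 training patterns
            {{1.0, 1.0}, {0.9, 1.1}}    // class 1 training patterns
        };
        // Query near class 1's patterns -> prints "1"
        System.out.println(classify(new double[]{0.95, 1.0}, classes, 0.5));
    }
}
```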

3. AppEngine

The choice of where to run the image processing is a delicate one. Some would argue that it would have been best to run the entire pipeline on App Engine and worry about its imposed limitations only when they are actually hit, or even that a home-made server would have sufficed. The choice was made, however, considering the potential capital investment and the state of the competition, in the hope that it is the right one.

4. Conclusion

In the end we obtained a routine that recognizes a leaf from a set of 20 known leaves with an accuracy of 80%. Although this is somewhat lower than that of other, more complex algorithms, the confidence of having a good basis can translate into a better application next time.

Motion Processing For Mobile Devices Using Embedded Sensors

Description

This project consists of two applications: one developed for an Android-based mobile phone and one to be installed on a computer. It allows the user to use the phone as a PC mouse by moving it in the air. The 'mouse' performs basic functions such as right click, left click, scroll and drag.

Prerequisites

  • Android 2.3+ mobile phone with gyroscope and accelerometer
  • WiFi
  • Phone and PC on the same local network

Project Overview

We use two types of sensors embedded in the mobile device: the accelerometer and the gyroscope. The Android application handles the processing of information acquired from these sensors, and then sends the resulting data to the computer via wireless. The computer runs a server application written in Java which receives data from the phone and then uses it to actually move the mouse pointer accordingly.

The PC Application

It is written in Java and communicates with the Android application by means of UDP sockets. It performs the movement of the pointer using Java's Robot class.
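The UDP payload can be as simple as a text message carrying pointer deltas. The following sketch shows the server-side parsing; the "dx,dy" message format is an assumption (the real protocol may differ), and in the full application the resulting coordinates would be passed to `java.awt.Robot#mouseMove`:

```java
// Sketch: parsing one UDP payload of the assumed form "dx,dy" and
// applying it to the current pointer position. The actual pointer move
// would then be performed with java.awt.Robot#mouseMove(newX, newY).
public class PointerUpdate {
    /** Apply a "dx,dy" delta message to the current (x, y) position. */
    static int[] apply(String payload, int x, int y) {
        String[] parts = payload.trim().split(",");
        int dx = Integer.parseInt(parts[0]);
        int dy = Integer.parseInt(parts[1]);
        return new int[]{x + dx, y + dy};
    }

    public static void main(String[] args) {
        int[] pos = apply("12,-5", 100, 200);
        System.out.println(pos[0] + "," + pos[1]); // prints "112,195"
    }
}
```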

The Android Application

Processing the movement of the phone consists of several steps: data acquisition, filtering the data for noise, and applying the movement algorithm. These steps involved working with the Android SDK on one hand and, on the other, understanding and implementing in Java complementary filters, weighted filters, the strap-down algorithm, Euler angles and the Verlet integration method.
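A complementary filter fuses the gyroscope (accurate short-term, but drifting) with the accelerometer (noisy, but drift-free). A one-axis sketch, with the blending coefficient 0.98 being a typical assumed value rather than the one tuned in the project:

```java
// Sketch: one-axis complementary filter fusing the gyro rate with the
// accelerometer-derived angle. An alpha close to 1 trusts the gyro in the
// short term, while the accelerometer term slowly corrects the drift.
public class ComplementaryFilter {
    static double step(double angle, double gyroRate, double accelAngle,
                       double dt, double alpha) {
        return alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle;
    }

    public static void main(String[] args) {
        double angle = 0.0;
        // Phone held still at 10 degrees: gyro reads ~0, accel reads 10.
        for (int i = 0; i < 500; i++) {
            angle = step(angle, 0.0, 10.0, 0.01, 0.98);
        }
        System.out.println(angle > 9.9); // converged toward 10 degrees
    }
}
```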


Future Developments

There are several improvements that can be made to this project, related mainly to its mathematical and physical aspects: automatic calibration of the sensors, replacing the integration method with a superior one such as Runge-Kutta, and using quaternions instead of Euler angles to solve the gimbal lock problem.

The Interactive Control of a Robot with the Help of a Mobile Device

1. Project Overview

Description

The project consists of two applications - one developed for the Android platform and one for a LEGO Mindstorms device - which communicate over a Bluetooth connection established when the Android application starts. Connecting these two modules enables the devices to exchange messages according to special protocols - Bluetooth and the LEGO Communication Protocol - so the Android user is offered the possibility to control the robot. The system is composed of a user interface displayed on the mobile device, a communication layer based on special messages corresponding to each type of available command, and an application for the robot - an artificial intelligence module - which allows it to choose its own moves during the game implemented.

Application Walkthrough

The basic principles of the project are illustrated by a classic game in which the Android user and the robot are opponents - Tic Tac Toe. The project becomes interactive due to the fact that, besides displaying the game on the mobile device's screen, the robot is able to draw the game on a sheet of paper with a writing tool it holds.

After starting the application on the mobile device, the user is requested to grant permission to start the Bluetooth service. If access is granted, a discovery scan is performed to detect nearby visible NXT devices that have Bluetooth turned on. A list of the matching devices, identified by their MAC addresses, is shown, and the user can choose the device he/she wishes to connect to. After the connection has been established, the user can choose from the main menu whether to start a new game, read information about the application or exit the application.

After choosing the "New Game" option, the user can decide which player will go first. After this step, the robot begins to draw the game board, sending the proper message when finished.
The board is also displayed on the mobile device, and the first move can be made. The general idea is that the robot draws the entire progress of the game on paper. The robot chooses its own moves using an Alpha-Beta pruning algorithm included in the leJOS project running on the device. After the robot's move, a message is sent to the Android device so that the display is updated. The user can select "X" and "0" by swiping the symbols on the touchscreen.
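The robot's move selection can be sketched as a standard minimax search with alpha-beta pruning over the 3x3 board. This is a generic sketch of the technique, not the leJOS project's actual code:

```java
// Sketch: alpha-beta search for Tic Tac Toe. board[i] is 'X', 'O' or ' '.
public class TicTacToeAB {
    static final int[][] LINES = {
        {0,1,2},{3,4,5},{6,7,8},{0,3,6},{1,4,7},{2,5,8},{0,4,8},{2,4,6}
    };

    static char winner(char[] b) {
        for (int[] l : LINES)
            if (b[l[0]] != ' ' && b[l[0]] == b[l[1]] && b[l[1]] == b[l[2]])
                return b[l[0]];
        return ' ';
    }

    /** Game value from the robot's ('O') point of view: 1 win, -1 loss. */
    static int search(char[] b, boolean robotToMove, int alpha, int beta) {
        char w = winner(b);
        if (w == 'O') return 1;
        if (w == 'X') return -1;
        boolean full = true;
        for (char c : b) if (c == ' ') full = false;
        if (full) return 0;                       // draw
        for (int i = 0; i < 9; i++) {
            if (b[i] != ' ') continue;
            b[i] = robotToMove ? 'O' : 'X';
            int s = search(b, !robotToMove, alpha, beta);
            b[i] = ' ';
            if (robotToMove) alpha = Math.max(alpha, s);
            else beta = Math.min(beta, s);
            if (alpha >= beta) break;             // prune this branch
        }
        return robotToMove ? alpha : beta;
    }

    /** Best cell index for the robot ('O') on the given board. */
    static int bestMove(char[] b) {
        int best = -1, bestScore = Integer.MIN_VALUE;
        for (int i = 0; i < 9; i++) {
            if (b[i] != ' ') continue;
            b[i] = 'O';
            int s = search(b, false, Integer.MIN_VALUE, Integer.MAX_VALUE);
            b[i] = ' ';
            if (s > bestScore) { bestScore = s; best = i; }
        }
        return best;
    }

    public static void main(String[] args) {
        // X threatens to win on the top row; O must block at cell 2.
        char[] board = "XX  O    ".toCharArray();
        System.out.println(bestMove(board)); // prints "2"
    }
}
```

On the NXT the same search runs within the robot's memory limits easily, since the Tic Tac Toe game tree is tiny.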

When all the moves have been made, both on the mobile device's display and on the robot's LCD, a message is shown to announce the winner. In the Android application, a new screen is shown, to offer the user the possibility to start a new game or leave the application.