NeuralPhone: A Brain-to-Smartphone Interface

There's growing interest in new hands-free interfaces for smartphones based on voice and face recognition. We developed the EyePhone, which lets the user select and activate applications with the blink of an eye. We then wondered whether a thought could also drive a smartphone application, and it turns out it can.

Until recently, devices for detecting neural signals were costly, bulky, and fragile. We developed the NeuralPhone, which uses inexpensive, off-the-shelf wireless electroencephalography (EEG) headsets to drive applications on the iPhone with neural signals (see Figure ).


Figure: A brain-to-smartphone interface.


The phone flashes a sequence of photos of contacts from the address book, and a P300 brain potential is elicited when the flashed photo matches the person the user wishes to dial. EEG signals from the headset are transmitted wirelessly to an iPhone, which natively runs a lightweight classifier to discriminate P300 signals from noise. When a person's contact photo triggers a P300, his or her phone number is automatically dialed. NeuralPhone breaks new ground as a brain-to-smartphone interface for pervasive computing.
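To make this concrete, below is a minimal sketch in Python of the flash-and-classify idea: score each post-flash EEG epoch against an averaged P300 template and pick the contact whose flash produced the strongest response. The function names, the correlation-based scoring, the sampling rate, and the threshold are illustrative assumptions, not the actual lightweight classifier running on the iPhone.

```python
# Minimal sketch of a flash-and-classify loop (illustrative only; the
# NeuralPhone's actual on-phone classifier is not reproduced here).
import numpy as np

SAMPLE_RATE_HZ = 128   # assumed sampling rate for a consumer EEG headset
P300_WINDOW_S = 0.8    # assumed post-stimulus window examined for the P300


def p300_score(epoch: np.ndarray, template: np.ndarray) -> float:
    """Correlate one post-flash EEG epoch with an averaged P300 template."""
    epoch = (epoch - epoch.mean()) / (epoch.std() + 1e-9)
    template = (template - template.mean()) / (template.std() + 1e-9)
    return float(np.dot(epoch, template) / len(epoch))


def pick_contact(epochs_by_contact: dict[str, np.ndarray],
                 template: np.ndarray,
                 threshold: float = 0.3) -> str | None:
    """Return the contact whose flashed photo elicited the strongest P300-like
    response, or None if no epoch clears the (assumed) threshold."""
    best_name, best_score = None, threshold
    for name, epoch in epochs_by_contact.items():
        score = p300_score(epoch, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

In a full system, the winning contact's number would then be dialed automatically, as described above.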

Big Sensor Data

Big data presents both challenges and opportunities. Today, we're seeing the emergence of new mobile health, well-being, and self-quantification apps that can automatically generate large numbers of sensor data streams. These streams are stored on phones and in the cloud for further mining, sharing, and visualization.

The BeWell application, for example, doesn't send raw data to back-end servers; rather, it uploads features, inferences, scores, and usability data to the cloud if the user opts to store and view longitudinal data. A typical BeWell user uploads about 20 Mbytes of data per day while his or her phone is charging and connected to the Internet. As continuous sensing applications gain popularity, they will produce terabytes of data that will need to be stored and processed in the cloud and potentially shared on social networks.
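As a rough illustration of that upload policy, the sketch below gates uploads on user opt-in, charging, and connectivity, and bundles only derived data (features, inferences, scores, usability logs) rather than raw samples. The class and function names are hypothetical; this is not BeWell's actual implementation.

```python
# Hypothetical sketch of a BeWell-style upload policy: ship derived data,
# not raw sensor streams, and only when the phone is charging and online.
from dataclasses import dataclass


@dataclass
class PhoneState:
    charging: bool
    connected: bool


def should_upload(user_opted_in: bool, state: PhoneState) -> bool:
    """Defer cloud uploads until the user has opted in and the phone is
    charging with a network connection."""
    return user_opted_in and state.charging and state.connected


def daily_payload(features: dict, inferences: dict,
                  scores: dict, usability: dict) -> dict:
    """Bundle the derived streams mentioned above; raw samples stay on the phone."""
    return {
        "features": features,
        "inferences": inferences,
        "scores": scores,
        "usability": usability,
    }
```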

Community Similarity Networks

As the user population of smartphone sensing apps grows, the differences between people will quickly degrade the accuracy of the classification system; we call this the population diversity problem. For example, how a young child walks differs greatly from how an elderly person walks, so the same model can't be used for both. To address this problem, we developed Community Similarity Networks (CSN), a classification system that can be incorporated into smartphone sensing apps to build robust classifiers for diverse populations.
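One way to read the CSN idea is as similarity-weighted training: when building the target user's classifier, other users' labeled sensor data contributes in proportion to how similar those users are to the target, so a young child's walking data barely influences an elderly person's model. The sketch below illustrates that reading; the similarity scores, data layout, and scikit-learn classifier are assumptions rather than the published CSN system.

```python
# Illustrative similarity-weighted training (not the actual CSN algorithm).
import numpy as np
from sklearn.linear_model import LogisticRegression


def train_personal_model(target_id: str,
                         data: dict[str, tuple[np.ndarray, np.ndarray]],
                         similarity: dict[tuple[str, str], float]) -> LogisticRegression:
    """data maps user id -> (feature matrix, labels); similarity maps
    (target, other) -> a score in [0, 1] from the community similarity network."""
    X_parts, y_parts, w_parts = [], [], []
    for user_id, (X, y) in data.items():
        # The target's own data gets full weight; others are down-weighted.
        w = 1.0 if user_id == target_id else similarity.get((target_id, user_id), 0.0)
        if w <= 0.0:
            continue
        X_parts.append(X)
        y_parts.append(y)
        w_parts.append(np.full(len(y), w))
    model = LogisticRegression(max_iter=1000)
    model.fit(np.vstack(X_parts), np.concatenate(y_parts),
              sample_weight=np.concatenate(w_parts))
    return model
```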
