sábado, 18 de mayo de 2013

Neuromarketing

One of the trends developing as a result of EEG technology becoming more affordable is neuromarketing. Neuromarketing is a B2B industry operating in a new field of marketing research that studies consumers’ sensorimotor, cognitive and affective responses to marketing stimuli. An example of a neuromarketing project is a study done by Professor Gregory Berns, who studied the brainwaves of a group of 27 teenagers while they listened to 120 unknown songs. His results showed that the songs that triggered the brain's reward centers ended up doing better in the charts when they were released. 



Key to the success of this kind of predictive analysis is correlating the brain’s response to stimuli, namely images, music and videos. At the time of writing, there are around 31 neuromarketing agencies in the US using this technology for market research purposes. The bulk of the work in this area is project-based and consultative in nature, done for major marketers such as P&G, Pepsi, Johnson & Johnson, Google and Mars during the pre-launch phase of new products or services.

Here at IE Business School we decided to do our own neuromarketing. Take a look at some photos from our "mind reading focus group" below: 










Our study was for the Pantene hair care brand in Spain. We monitored the subconscious responses of six women - three users of the brand and three non-users of the brand - while they watched Pantene's latest TV ad (which you can view here).

We found that brand users pay a lot of attention to the hair of the model – they compare her hair with their own and get frustrated because “it’s not as nice”. However, this frustration is overcome, as brand users are relaxed by the end frame and slogan – “hair so healthy it shines”. It is familiar, makes them feel good, and they already know the product, so they don’t need to worry about what the bottle looks like. 

Conversely, non-users were relaxed by the frames featuring beautiful hair and paid attention to the end frame of the advertisement. This was part of a yearning for the spa-like environment, which then led to heightened levels of attention at the end of the ad as they assessed the look of the actual product. 

Notably, all six women paid attention to the animation of the vitamin capsules repairing the hair. They tended to take away the message that the shampoo is full of vitamins that help make your hair soft. However, none could remember the variant of the brand – they could simply remember the brand itself. When asked if they would look for the variant in store, all said no. The ones who didn’t use it would not switch, and the ones that did use Pantene did not feel the need to change their current formula. 

Both groups also loved the spa scenes. They felt relaxed and somewhat pampered by the luscious environment, with the featured waterfall being the most prominent image the women connected with. The very beginning of the ad, however, was a big fail. No one connected with the model's pain of “crispy hair”. They could not recall this part of the ad, nor did these images register with them subconsciously. 

Hence, you can see how this type of research might become very beneficial for marketers and business strategy in the future!







martes, 7 de mayo de 2013

Researchers enable mind-to-email communication

Researchers at IE Business School have enabled mind-to-email communication by having subjects sing songs in their heads. The study showed that a unique combination of delta, theta, alpha, beta and gamma signals is produced for each word when it is sung in the context of an already known song.

A snapshot from their report is shown below. In this case, an inexpensive EEG headset from NeuroSky was attached to the head of a 26-year-old male. He was asked to sing the chorus of a well-known song in his head repeatedly for one minute (he chose Boom! Shake the Room by Will Smith). Then, in order to ensure the brain waves differed depending on the words and not the tones in the song, the man was asked to sing the chorus again, using the same words but at different places in the song. The results are astonishing.



You can clearly see a difference between Boom / Room, Shake and The. What validates this research further is the extremely small difference between Boom and Room. The fact that similar words produce similar EEG signals suggests that the make-up of a word, in terms of its letters, creates a unique brain wave combination. By coding each of these signal combinations and linking them to specific words, we are just an API away from transmitting words directly from our minds to our computers, smartphones and tablets. Which means that soon, you will literally be able to write an email just by thinking it!
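To see what "coding each of these signal combinations and linking them to specific words" could look like, here is a minimal sketch: each word gets a stored band-power signature (delta, theta, alpha, beta, gamma), and a new reading is matched to its nearest template. All the signatures and numbers are invented for illustration; only the nearest-template idea is the point.

```python
import math

# Hypothetical band-power signatures (delta, theta, alpha, beta, gamma)
# for each word -- illustrative numbers only, not from the study.
WORD_TEMPLATES = {
    "boom":  [0.62, 0.41, 0.35, 0.28, 0.11],
    "room":  [0.60, 0.43, 0.34, 0.27, 0.12],
    "shake": [0.30, 0.55, 0.48, 0.40, 0.22],
    "the":   [0.15, 0.20, 0.60, 0.55, 0.30],
}

def classify(sample):
    """Return the template word whose band-power vector is closest
    (by Euclidean distance) to the observed sample."""
    return min(
        WORD_TEMPLATES,
        key=lambda w: math.dist(sample, WORD_TEMPLATES[w]),
    )

print(classify([0.61, 0.42, 0.35, 0.28, 0.11]))  # closest to "boom"
```

Note that "boom" and "room" sit very close together here, mirroring the small Boom/Room difference in the snapshot - which is exactly why a real system would need many repetitions to separate similar words reliably.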

martes, 30 de abril de 2013


WIN A PAIR OF GOOGLE GLASSES!*

This is for real, people. To enter the draw, simply like and share this post with the following comment:

"These glasses will go great with my summer dress!" 

The winner will be the person with the highest amount of combined likes and shares on their post by 12PM Monday the 13th of May.

To enter, you must take a screenshot of your post and email it to cforbes.imba2013@student.ie.edu.
Submissions received after 12PM Monday the 13th of May will not be accepted.

Good luck!



*Glasses to be delivered in time for Christmas Day, 2016


Brain computer interfaces to be mainstream by 2015



Google glasses are SO 2013! Forget nodding your head and blinking to control media in front of your eyes. By 2015, brain-computer interfaces (BCIs) will be so widely used that a variety of apps will be available for you to control with your mind. We talked about Samsung's mind-tab a while back, but the problem with that was that you need to buy a separate BCI to use with it. Although it seems odd today that we'll all end up doing that, companies like InteraXon and Neurowear are developing stylish versions for us to wear every day. But don't take our word for it; check out what The New York Times has to say about the latest disruptive technology.



jueves, 25 de abril de 2013

Brain-Controlled Arm in the White House


Starting this blog, I realized that brain-controlled devices are no longer imaginative sci-fi gadgets but something technologically achievable and economically affordable. At the annual White House Science Fair, a 17-year-old boy introduced his creation: a mind-controlled prosthetic arm (see the link below).

http://techcrunch.com/2013/04/22/3-awesome-and-inspiring-inventions-from-the-white-house-science-fair/

The arm is controlled by his concentration (a pattern of brain waves) and eye blinks; it moves smoothly and can shake hands with him.


Furthermore, he sells it for $250, thanks to low-cost production on a 3D printer; it's a collaboration of two cutting-edge technologies. So far the product looks rough and there is plenty of room for refinement, in my opinion. But no doubt the future is headed this way!

martes, 23 de abril de 2013

BIG. NEWS.

Samsung is officially working on a system that allows you to control a tablet with your brain. It works by focusing on repetitive visual patterns.
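The "repetitive visual patterns" approach is generally known as SSVEP (steady-state visually evoked potentials): each on-screen icon flickers at its own frequency, and staring at one makes that frequency dominate the EEG over the visual cortex. Here is a toy decoder along those lines; the frequencies, commands and simulated signal are all invented for illustration, not taken from Samsung's demo.

```python
import math

SAMPLE_RATE = 256  # Hz -- a typical consumer EEG headset rate

# Each icon flickers at its own frequency; the mapping below is a
# hypothetical example, not Samsung's actual command set.
COMMANDS = {7.0: "open app", 9.0: "scroll", 11.0: "select"}

def power_at(signal, freq):
    """Power of `signal` at `freq`, via a direct DFT projection."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / SAMPLE_RATE)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
             for i, s in enumerate(signal))
    return (re * re + im * im) / n

def decode(signal):
    """Return the command whose flicker frequency is strongest."""
    return COMMANDS[max(COMMANDS, key=lambda f: power_at(signal, f))]

# Simulated one-second EEG dominated by a 9 Hz flicker response.
sig = [math.sin(2 * math.pi * 9.0 * i / SAMPLE_RATE)
       for i in range(SAMPLE_RATE)]
print(decode(sig))  # "scroll"
```

In practice real EEG is noisy and the dominant frequency is much less clear-cut, which is one reason the demo's music app felt slow.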

This means that developing brain-controlled applications will become a scalable, investable business in no time. Which applications would you like to run hands free? Evernote? Spotify? Twitter? Google Maps? YouTube? Something else? Tell me!

PS: Check out the demo via the link below. The music app is kinda slow, but if you want to tell someone you need a change of clothes (perhaps as a result of hands free browsing...) it's as easy as ABC!

http://www.technologyreview.com/news/513861/samsung-demos-a-tablet-controlled-by-your-brain/



jueves, 18 de abril de 2013

Open Your Heart


Searching for something to write about in this week’s blog, I encountered an interesting story.

Caleb Forbes, our talented group mate (who is clearly very much into human-computer interfaces), is taking part in a competition run by the BusinessBecause website, which asks MBA students around the world to predict what the world will look like when they retire, and how it is going to change.

Caleb, obviously a top finalist in the competition, predicts that in a couple of decades we will be able to communicate with each other and control the physical world around us through heart-to-heart communication, rather than face-to-face, phone-to-phone, or even brain-to-brain communication!


If true, do you think this might make our connections with others more intimate and sincere? Will this change who we are and help us change the world into a better place?

We will have to wait and see…

jueves, 11 de abril de 2013

Think your password!

Do you know this situation? You are at an ATM on a crowded street and you are afraid that someone can see you typing your PIN.

This could be history in a couple of years! In the future you will be able to just THINK your password - and with the help of a common EEG headset and a piece of software you will get access to your bank account, email account, etc.

Professor John Chuang (Source: http://www.ischool.berkeley.edu)


A team from the UC Berkeley School of Information released an interesting paper about this. Within a defined threshold, they were able to reduce the error rate to 1%. This is pretty amazing.
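The "defined threshold" idea boils down to template matching: enroll a feature vector from the user's pass-thought, then accept a login attempt only if its distance to the template is below the threshold. Everything numeric below is invented; only the general thresholding scheme reflects how such systems work.

```python
import math

# Illustrative enrolled "pass-thought" template and acceptance
# threshold -- hypothetical numbers, not from the Berkeley paper.
ENROLLED = [0.52, 0.31, 0.44, 0.27]
THRESHOLD = 0.10

def authenticate(sample, enrolled=ENROLLED, threshold=THRESHOLD):
    """Accept if the EEG feature vector is within `threshold`
    (Euclidean distance) of the enrolled template."""
    return math.dist(sample, enrolled) <= threshold

print(authenticate([0.53, 0.30, 0.45, 0.26]))  # True: close match
print(authenticate([0.90, 0.10, 0.20, 0.70]))  # False: impostor
```

Tightening the threshold rejects more impostors but also rejects more genuine attempts - tuning that trade-off is where the 1% error rate comes from.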

So when we project this research into a future where EEG sensors are much smaller and integrated into our daily lives, this could be a big benefit for both convenience and security during authentication.

But as with every invention in the area of security, there is also a way to manipulate it. You will find some interesting points about "hacking your mind" here in the coming days.

Stay tuned!

martes, 2 de abril de 2013

PUI - a new gateway to control your computer!


Perceptual User Interface (PUI) is a kind of technology that allows users to interact with a computer without using aids such as a keyboard and mouse. It is extremely useful for people who have physical difficulties using computers, and it has become a very popular subject in the computing departments of most universities.

During my undergraduate degree, I took a course called computer vision. Computer vision is a field that includes methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information[1].

During the course, we were required to develop a graphical user interface that allows users to interact with the system by pointing with their finger. Basically, we set up a camera to record the movement of our fingers; the camera was connected to the computer, sending it live video. What the program does is trace the tip of the finger as it moves in the video (please see the picture below - there is a red arrow at the tip of the finger in the right window).

This program is very helpful for disabled people who cannot use a keyboard or mouse. For example, if a disabled person wants to tell the computer to open a website, he or she can simply point a finger at a specific spot that we have pre-defined as the 'internet browser' point, and the computer opens the web browser automatically. It does sound similar to an iPhone, but the difference is that to use an iPhone's touch screen, the user needs to physically touch the screen, whereas the program I developed does not need the user to touch the screen at all. This is much cooler, as you can control your computer remotely without any physical touch!
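The core of such a tip tracker can be sketched in a few lines: once a camera frame has been thresholded into a binary finger mask, the fingertip is simply the topmost foreground pixel (assuming the hand enters from the bottom of the frame). This is a toy reconstruction of the idea, not the actual course code:

```python
def fingertip(mask):
    """Return (row, col) of the topmost 'on' pixel in a binary finger
    mask -- the point the red arrow would be drawn at. Assumes the
    hand enters the frame from the bottom, so the tip is the highest
    foreground pixel."""
    for r, row in enumerate(mask):
        for c, on in enumerate(row):
            if on:
                return (r, c)
    return None  # no finger in this frame

# Toy 5x5 "frame": a vertical finger whose tip is at row 1, column 2.
frame = [[0, 0, 0, 0, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 1, 0, 0]]
print(fingertip(frame))  # (1, 2)
```

Running this on every frame of the live video, and checking whether the returned point falls inside a pre-defined hotspot like the 'internet browser' region, gives you the hands-free pointing behaviour described above.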






[1] http://en.wikipedia.org/wiki/Computer_vision

lunes, 11 de marzo de 2013

We found a very interesting article about brain-computer interfaces, in this case applied to medicine. We believe this is the future for quadriplegic patients. Brown University researchers created an implanted brain-computer interface. Although it has so far only been implanted in pigs and monkeys, without problems, scientists are positive that human implants are around the corner.

The wireless BCI allows the host to move freely while it collects data, allowing scientists to analyze brain activity when the subject is involved in a complex activity, such as a social interaction, rather than just the movements of its limbs.

With very powerful and efficient use of energy, this device is the size of a pacemaker. It can extract high-quality, rich neural signals and will be a huge help for human neuroscience once it is approved.



Brown researchers were also able to safely test the device on humans, without the implant of course. The result of one test was very successful: a quadriplegic woman was able to move a robotic hand with her mind.

This breakthrough is extraordinary and a huge step for BCIs. When this technology is approved, millions of patients suffering from various paralyzing diseases will be able to have a better lifestyle.

Scientists will also be able to study the behavior of animals with Parkinson's disease, gaining a better understanding and, hopefully, finding a cure in the near future.




jueves, 7 de marzo de 2013

Our topic is...Human-Computer Interfaces!

Telepathic rats...tweeting from your mind...playing music based on your mood...all these things are possible thanks to human-computer interfaces (or in the case of the rats, rat-computer interfaces!). The technology works by harnessing the signals your body emits and turning them into data. It comes from the health industry, where it's been used for over 20 years to diagnose brain and heart conditions.

But now, smart people and smart companies are realising that this data can be repurposed for other, exciting business purposes. The three main trends we are seeing are the analysis of this data for market research purposes, the use of this data to control applications, and the use of this data to enhance communications.

From a market research point of view, it's all about predictive analysis. What companies want to do, and what some are already doing, is look at what people do when they are thinking and feeling a certain way. For example, when I am sad, I want to eat Indian food and watch sci-fi movies. So, if a company knows that, and they know I am sad, they can send me marketing messages about Indian restaurants and sci-fi movies. It's like behavioural targeting 3.0 (since we are still on 2.0, right?).
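In code, this kind of targeting reduces to a lookup from a detected mood to a set of offers. The rule table below just wires up the example from the paragraph above as a hypothetical illustration:

```python
# Hypothetical mood-to-offer rules; the "sad" entry mirrors the
# example above (Indian food + sci-fi movies).
OFFERS = {
    "sad":   ["Indian restaurant deals", "sci-fi movie rentals"],
    "happy": ["concert tickets", "holiday packages"],
}

def target(mood):
    """Return the marketing messages mapped to a detected mood,
    or nothing if the mood has no rule."""
    return OFFERS.get(mood, [])

print(target("sad"))  # ['Indian restaurant deals', 'sci-fi movie rentals']
```

The hard part, of course, is not the lookup but reliably inferring "sad" from biometric data in the first place.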

From an applications point of view, a plethora of gadgets have flooded the market over the last year as the technology has become cheaper and more accessible. These include things like mechanical tails that wag with your mood and games that you control with your mind (like blowing up zombies' heads). A headset has also just been launched that plays music based on your mood. No more sifting through your library to find "Killing in the Name" by Rage Against the Machine after someone cuts you off in a queue.



And from a communications point of view, if rats can talk to each other using just their minds, why can't we? So, applications that let you tweet from your mind have been developed. And there are others that have been developed for disabled people who can't talk, with the idea being to turn their thoughts into words.

Neuroscience and the brain are at the centre of human-computer interfaces. But new breeds are popping up, most interestingly heart-computer interfaces. Some people believe the heart is the intellectual centre of the body, so it will be interesting to see how the data from our hearts differs from that of the brain, and therefore what different kinds of business applications can be developed!

So, this is what we will be exploring over the next 15 sessions :-)