Google recently developed a technology that can detect human emotions in a 2 X 2 dimension (energy level and body movement). What are some of the products you can build using that technology? List a few ideas, then choose one and design the product for that idea.
+1 vote
1.6k views
in Product Design by (725 points)

3 Answers

+6 votes

Just to reiterate, Google developed a technology that can detect human emotions based on energy level and body movement. I would start by asking clarifying questions about how this technology works. Does it require special cameras? Does the user have to turn it on? What is the confidence level of the results? Does the user need to opt in? What kinds of human emotions are we talking about?

Assuming that the user does need to opt in and turn it on, but that it works with a normal laptop or phone camera, I would start brainstorming ways this technology might solve an existing user problem.

– This technology might be useful for detecting fatigue and general tiredness while working. Google could offer a product that reminds users to get some air or take a walk during the workday.
– Emotion data is useful for gauging how a user feels about any particular product. This would be especially useful for both the user and the app, informing how to optimize UI/UX, product placement, etc.
– Emotion data would be useful for YouTube, where it could hone in on video recommendations far better than a simple thumbs up or thumbs down.

I want to build number 3, because a better recommendation system would be kind of amazing.

The goal of this feature is to deliver a better recommendation system for users who opt-in to this technology.

We will know the recommendation engine is working through longer viewing times on YouTube, generally happier users as measured by the emotion technology, and better retention.

The problem we are trying to solve for our users is how to surface more relevant content so that they can watch more videos aligned with their interests, understanding that those interests may change every day. Some key features for this would be:

– Users would need to opt in and calibrate the camera with their face to make sure detection works. There would also need to be an opt-in prompt every time a user visits YouTube.
– A backend database that tracks the videos you’ve watched and the change in emotion from the beginning of each video through the end
– A front-end display showing users how their emotions changed throughout the video
– A prompt asking users how they really felt about the video, compared to the emotion captured by the technology
– An algorithm that matches your personal preferences with your emotional response to pick the next video you want to watch

Within the algorithm, there are many more features we can build. What do the emotions tell us? What does each of the four grid cells really mean in terms of how a user feels about a particular video? That’s why we pair the detected emotions with the actual ratings the user provides, to inform how we move forward.
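To make this concrete, here is a minimal sketch of one way the 2 x 2 grid and the explicit ratings might combine. Every name, label, threshold, and weight here is an illustrative assumption, since the question doesn't specify how the technology exposes its readings:

from dataclasses import dataclass

# One reading from the (assumed) emotion technology: both axes of the
# 2 x 2 grid, normalized to [0, 1]. These field names are hypothetical.
@dataclass
class EmotionReading:
    energy: float    # 0.0 = low energy, 1.0 = high energy
    movement: float  # 0.0 = still, 1.0 = very active

def quadrant(r: EmotionReading) -> str:
    """Map a reading to one of the four grid cells (labels are guesses)."""
    if r.energy >= 0.5:
        return "excited" if r.movement >= 0.5 else "engaged"
    return "restless" if r.movement >= 0.5 else "calm-or-bored"

def video_score(start: EmotionReading, end: EmotionReading,
                rating: float) -> float:
    """Pair the emotion shift across a video with the explicit rating.

    rating is assumed normalized to [-1, 1] (thumbs down/up); the 60/40
    weights are placeholders a real system would learn from data.
    """
    emotion_delta = end.energy - start.energy  # proxy for "left happier"
    return 0.6 * rating + 0.4 * emotion_delta

# Example: viewer started sluggish, finished energized, gave a thumbs up.
score = video_score(EmotionReading(0.2, 0.3), EmotionReading(0.8, 0.4), 1.0)
print(quadrant(EmotionReading(0.8, 0.4)), round(score, 2))  # engaged 0.84

The per-video score could then feed the next-video ranking alongside traditional signals like watch time.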

All of these bullet points would be key to the first launch, as they provide the information we need to keep iterating on the recommendation engine. We will know the recommendation engine is working when our internal confidence metric lines up with the user’s actual ratings, and also through traditional metrics like how many videos users watch one after another.

In summary, I would leverage the emotion engine that Google created to deliver better recommendations for users on YouTube. At first launch, we should address user privacy concerns by making the feature opt-in, and then gather data both from the technology and from user input to help our machine learning algorithm produce better results over time.
+2
Scott, amazing creativity! I was thinking of leveraging a tool like this to help nonverbal people communicate. If you have an autistic kid, could you use emotion recognition, via facial expression and energy detection, to predict whether they are feeling hungry, bored, tired, happy, etc.?
You might want to address privacy concerns too. There should be a way for someone to turn off the facial/energy detection by voice when they want their privacy protected. Or, as soon as someone else comes into the frame, the application could notify them that they are being watched.
Will you record videos? Will you store them on a server to analyze the emotions in real time? Talk about deleting video/energy recordings as soon as the emotion analysis is done, unless the user explicitly chooses to save them.
+2
That’s a good one. Extending it: this could help blind people understand other people’s expressions when conversing with them in the real world.

Track people’s reactions to new products in retail.

Monitor the health of collaboration in an open office.
+1 vote

I would structure my answer as follows:

 

1. Clarifying the question

Given that Google has developed a technology that measures energy levels and motion to detect a user’s emotions, what could I build with it?

 

2. Brainstorm possible products that could leverage this technology

  • An app that detects your pet’s emotions so that, as a pet owner, you can better understand and care for your pet 

  • An app that helps elderly people who have trouble communicating express their emotions to their loved ones 

  • A tool to help therapists understand how patients, particularly children, really feel about a specific issue during a therapy session 

  • A wellness app that recommends yoga, meditation techniques and diet tips based on users’ current mood/emotions 

 

3. Focus on one idea and develop a product for it 

I am choosing to design a product for the last idea: a wellness app that recommends yoga, meditation techniques and diet tips based on users’ current mood/emotions 

 

This product would be integrated as a new app for an existing wearable product, like a Google watch for example. It would use a range of user data, including energy levels and motion, to detect the user’s current set of emotions and which of these emotions is prevalent at a given point. 

I would focus on negative/challenging emotions, since this is the type of situation where a user would want to feel better and improve their mood. 

I’d start with a list of six basic emotions, which could be expanded later based on initial data:

  • Feeling sad 

  • Feeling angry  

  • Feeling afraid  

  • Feeling anxious  

  • Feeling fatigued

  • Feeling bored 

 

Some use cases where I envision the app being used:

  • User feeling afraid during a flight 

  • User feeling anxious before a job interview 

  • User feeling sad after receiving difficult news 

  • User feeling angry after a bad day at work 

 

The user would open the app and be presented with an emotional evaluation detailing how they are feeling and which emotions prevail, along with recommendations based on their contextual preferences in that particular moment.

There would be 3 main sections for the user to pick from, based on their context and goals (a small routing sketch follows the list):

  1. Meditation sessions (specific goals and lengths available based on user’s preferences)  

  2. Yoga sessions (different goals and lengths available based on user’s preferences)  

  3. Diet recommendations (different type of recipes/snack suggestions based on season and time of day)
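
As one way to picture the routing, here is a minimal sketch that picks a section from the prevailing detected emotion. The emotion labels match the list above; the rules themselves are illustrative assumptions rather than a real design:

# Hypothetical routing from prevailing emotion to one of the three sections.
# Rules and thresholds are placeholders; a real app would personalize them.

def recommend_section(prevailing_emotion: str, hour_of_day: int) -> str:
    if prevailing_emotion in {"anxious", "afraid", "angry"}:
        return "meditation"  # short calming/grounding sessions
    if prevailing_emotion in {"sad", "bored"}:
        return "yoga"        # gentle movement to lift mood
    if prevailing_emotion == "fatigued":
        # Diet suggestions vary by time of day, per the section above.
        return "diet (morning snack)" if hour_of_day < 12 else "diet (evening meal)"
    return "meditation"      # safe default for anything unlisted

print(recommend_section("afraid", hour_of_day=14))   # -> meditation
print(recommend_section("fatigued", hour_of_day=9))  # -> diet (morning snack)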

 

Over time, the app would gather and analyze both qualitative and quantitative data in order to optimize and personalize recommendations for the user:

  • Quantitative feedback: the app itself would measure the effectiveness of these recommendations by tracking any change in the user’s emotions, and present these changes to the user as a feedback loop mechanism  

  • Qualitative feedback: 

    • Users can rate and share the most useful/effective content for qualitative feedback 

    • Users can save the most useful/effective content for easy access in the future 

 

4. How would I measure success?

  • Like any app, I would look at DAU/WAU/MAU to understand how engaging the app is and how effectively it drives users to return regularly 

  • To understand the app’s effectiveness, in addition to ratings and engagement metrics, I would measure how the user’s emotions change after they engage with the app (e.g. can we see a decrease in negative emotions after the user completes a meditation session to manage their fears?); a small sketch of this metric follows
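
As a sketch of that last measurement, assuming the app logs a negative-emotion score before and after each session (the field names are hypothetical):

# Share of sessions where the negative-emotion score dropped after the user
# engaged with a recommendation. Field names are illustrative assumptions.

def improvement_rate(sessions: list[dict]) -> float:
    """sessions: [{"neg_before": float, "neg_after": float}, ...]"""
    if not sessions:
        return 0.0
    improved = sum(1 for s in sessions if s["neg_after"] < s["neg_before"])
    return improved / len(sessions)

# Example: 2 of 3 meditation sessions ended with less negative emotion.
print(improvement_rate([
    {"neg_before": 0.8, "neg_after": 0.3},
    {"neg_before": 0.6, "neg_after": 0.7},
    {"neg_before": 0.5, "neg_after": 0.2},
]))  # -> 0.666...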

 

by (19 points)
0 votes


 
Clarifying questions - 
1) Is the technology an app, or can it be integrated with existing software? I'm assuming it can be integrated with existing products. 
2) Does the technology use a camera in order to detect human emotions? 
 
Before brainstorming some ideas, I'd like to understand the value of the tech: 
- knowing/understanding human emotions can help product owners/companies understand how users are reacting to experiences, or suggest relevant ads 
- between customers, the tech could help people communicate more honestly 
- in a health product, such as mental health, the tech could help track the effectiveness of treatments and therapy sessions 
- for the wearables industry, it could help track workouts and your mental health 
- for users who have a vision impairment, it could help them understand their surroundings better 
 
Product ideas - 
1) Scenario for medical professionals who conduct therapy sessions - use the tech to understand more deeply how a certain medicine/treatment is affecting a patient's recovery journey 
2) Scenario for wearables - use the tech to help identify users' mental patterns and feed the data into a feedback loop for other integrated products. One use case: if the user shows sustained low energy levels, suggest motivation tips or show an incentive to take a fitness class nearby 
3) Scenario 2 for wearables (for customers with vision impairments) - similar to the Bose sunglasses product, where there's a speaker in the sunglasses. We can integrate the tech into the sunglasses to help users with a vision impairment pick up more contextual clues from the people they're speaking with 
4) Scenario for parents with young kids - parents may have a hard time understanding their child in various settings. This tech could be surfaced in a way that gives parents more information about a specific situation. For example, if their child got caught shoplifting and later was discussing with their parents why it happened, the parents could get real-time details on how their child feels (remorse, revenge, etc.) 
5) Scenario for product owners - user sentiment is very valuable when building a great experience. Being able to gauge it effectively, without users having to fill in a feedback form, would be great. One scenario here would be understanding how users feel about ad relevance. Is the ad-ranking model working and providing value to customers? Are the ads relevant enough? Having the tech as one of the indicators of success for an ad feature (you can pick any feature; I think it's valuable anywhere) would be a good signal to look at. 
 
Since I'm a product owner, the last scenario resonates with me. Let's assume the product I'm working on is ads. The questions I have that the tech will help answer are: 
1) Are the ads relevant enough? 
2) Are the ads annoying the customer? 
3) What could we improve or experiment with when it comes to ads? It could be placement, size, or messaging. 
4) Another stretch idea is to factor emotions into customers' daily commute. For example, if the tech can detect when a user is hungry, suggesting food options on their route home from work would be interesting. 
 
I would first understand how costly it is to integrate the tech with existing search. If it's easy to integrate, then I would A/B test it with a small subset of users. There's a privacy component associated with the integration: will users feel comfortable with Google watching their emotions? We would need to put together a value proposition. Assuming they're comfortable, I would define a set of key metrics to track. Since ads is a mature feature, gathering a set of baseline metrics would be easy. I would experiment with different ways to show contextual ads and measure them against how users are feeling; a minimal sketch of the readout follows. 
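
Here is a minimal sketch of how that A/B readout might look, pairing the emotion signal with a traditional metric like click-through rate. The record fields and the comparison itself are assumptions for illustration; a real analysis would add significance testing:

from statistics import mean

# Compare the emotion-informed ads group against control on two signals:
# average detected emotion score and click-through rate. All fields are
# hypothetical.

def summarize(group: list[dict]) -> dict:
    return {
        "avg_emotion": mean(r["emotion_score"] for r in group),
        "ctr": sum(r["clicked"] for r in group) / len(group),
    }

control = [{"emotion_score": 0.4, "clicked": False},
           {"emotion_score": 0.5, "clicked": True}]
treatment = [{"emotion_score": 0.7, "clicked": True},
             {"emotion_score": 0.6, "clicked": True}]

print(summarize(control))    # {'avg_emotion': 0.45, 'ctr': 0.5}
print(summarize(treatment))  # {'avg_emotion': 0.65, 'ctr': 1.0}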
 
To summarize: since Google likes to build its products in house, we would test how well the tech does internally before contracting it out to other vendors. I can see many different ways the tech could help other products. 
by (54 points)