This is an ongoing research study in collaboration with Cornell Tech and funded by the NSF.
The goal of the project is to investigate user perceptions of reductions in the quality, latency, and availability of common applications as a method of decreasing end-use energy consumption (i.e., carbon footprint). As the lead UX researcher, I was responsible for strategizing how we might measure the likelihood of user acceptance of such technologies. Ultimately, a mixed methods approach, pairing thematic qualitative analysis with quantitative statistical analysis, revealed that users may be willing to trade latency for carbon footprint reductions.
As the lead UX researcher, I lead weekly meetings with a team of professors from Cornell Tech and Georgia Tech. I completed the IRB protocol, created all survey items, and completed all data analysis and mockups.
As our research is ongoing, our solution is still in the initial sketching/brainstorming phase. The current example shows a user interface that empowers users to modify computing applications based on carbon footprint. This could potentially be used to expose the tradeoff between large-scale computing services and the carbon cost of using them.
The pilot survey (N=30), which compared usage patterns of 13 common applications, revealed that two of the most commonly used applications are the Google search engine (83.3% of users reporting use at least once a day) and social media (63.3% of users reporting use at least once a day).


These applications became the subject of questions formed for the primary survey, which examined the likelihood that users would accept a trade-off of the quality, latency, or availability of a given application for a decrease in personal carbon footprint.
These dimensions directly relate to the hardware resources (e.g., computing, memory, and storage demands) and systems software (e.g., task scheduling, VM orchestration) needed to service large-scale ICT, as well as the energy consumed by deploying software services; these are the components of system design that impact user experience [1, 2, 12, 14].
A Wilcoxon signed-rank test was performed to determine whether responses to paired Likert-scale questions differed. Qualitative in vivo coding was used to analyze open-ended participant responses and gather insight into the reasoning behind user preferences. A sketch of the statistical comparison appears after the list of relationships below.
Three significant relationships were revealed:
• Google Quality Tradeoff x Google Latency Tradeoff
• Google Availability Tradeoff x Google Latency Tradeoff
• Social Media Availability Tradeoff x Google Availability Tradeoff
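To illustrate the quantitative side, here is a minimal sketch of how such pairwise comparisons might be run. The CSV file and column names are hypothetical placeholders, not our actual survey schema.

    import pandas as pd
    from scipy.stats import wilcoxon

    # Hypothetical file and column names; the real survey items and coding differ.
    responses = pd.read_csv("primary_survey_responses.csv")

    pairs = [
        ("google_quality_tradeoff", "google_latency_tradeoff"),
        ("google_availability_tradeoff", "google_latency_tradeoff"),
        ("social_media_availability_tradeoff", "google_availability_tradeoff"),
    ]

    for a, b in pairs:
        # Paired, non-parametric comparison of two Likert-scale items.
        stat, p = wilcoxon(responses[a], responses[b])
        print(f"{a} vs {b}: W = {stat:.1f}, p = {p:.3f}")

A paired non-parametric test is used here because Likert responses are ordinal and the two items being compared come from the same participant.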
Through the mixed methods study described above, combining thematic qualitative analysis with quantitative statistical analysis, the following insights were derived.

"Incredibly inconvenient and not worth the trade-off unless it was a MASSIVE difference" (P20)
"As a student, I cannot afford having limits set on the times I can use google"(P7).


"Does this correspond to money or my direct benefit in any way? If you make the tie between energy use and money direct and explicit, people will react to those incentives without your sanctimonious preaching" (P16).


"this trade off would not be much effort or difference so I would probably participate"(P20).
"I would not mind slower-running products and services if I knew they were conserving energy overall."(P17).


"I don’t know if energy is the deciding factor here. I think I use social media too much, so it would be helpful to limit the time I spend on social media" (P11).
The availability tradeoff for social media was largely accepted (a significant relationship with acceptance of the Google availability tradeoff).
• Preliminary design concepts
• Additional survey development to collect more reliable data
• Weekly meetings
• Patiently waiting to hear the status of our CACM paper submission
As of now, we are left with more questions to seek answers to in the next phase.
• How do we define the thresholds at which users will accept changes in quality, latency, and availability?
• How can we convey the carbon cost of various ICT services in a meaningful way to the target user?
A surprising finding to explore further:
Eco-mindedness was not found to correlate with likelihood ratings. This could be due to the phrasing of our questions, equating CO2 emissions saved to trees saved per year. It is possible users cannot see the tangibility of the trade-off in the way we presented it. Additionally, this finding could suggest that a user’s self-identified level of "eco-mindedness" does not influence ICT decisions in the same way that it may influence more material consumer decisions.
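For completeness, a minimal sketch of how such a correlation check might look, using a Spearman rank correlation; the file and column names are hypothetical placeholders rather than our actual survey items.

    import pandas as pd
    from scipy.stats import spearmanr

    # Hypothetical file and column names; the real survey items differ.
    responses = pd.read_csv("primary_survey_responses.csv")

    # Rank correlation between self-reported eco-mindedness and the
    # likelihood rating for accepting a latency tradeoff.
    rho, p = spearmanr(responses["eco_mindedness"], responses["latency_tradeoff_likelihood"])
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")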
The IRB process at Georgia Tech proved very time-consuming and limited the early phases of the project. I look forward to working in industry soon, where IRB protocols are a thing of the past!
Additionally, leading a group spread across multiple states was challenging at times. This may also influence my choice between remote and in-person work come May.