Mohammad Tahaei

Senior Research Scientist

Nokia Bell Labs


I advocate for including the human factor in the design of computer and AI technologies. Through the lens of empirical methods, both qualitative and quantitative, I inform the design of future technologies. Since 2018, I have studied privacy and security technologies directed at software developers to evaluate, design, and build tools that can assist software developers in building privacy-friendly and secure systems. Recently, I have started exploring how to make AI technologies that respect individuals, societies, and our future.

Interests

  • Human-computer interaction
  • Usable privacy and security
  • Responsible AI
  • Software engineering

Education

  • Ph.D. in Informatics, 2021

    University of Edinburgh

  • M.Sc. in Computer Science, 2017

    University of Bonn

  • M.Sc. in Information Technology, 2013

    University of Tehran

Selected Publications

Privacy, Permissions, and the Health App Ecosystem: A Stack Overflow Exploration

Health data is considered sensitive and personal; both governments and software platforms have enacted specific measures to protect it. Consumer apps that collect health data are becoming more popular but raise new privacy concerns, as they collect unnecessary data, share it with third parties, and track users. However, developers of these apps are not necessarily knowingly endangering users’ privacy; some may simply face challenges working with health features.

To scope these challenges, we qualitatively analyzed 269 privacy-related posts on Stack Overflow by developers of health apps for Android- and iOS-based systems. We found that health-specific access control structures (e.g., enhanced requirements for permissions and authentication) underlie several privacy-related challenges developers face. The specific nature of the problems often differed between the platforms: for example, additional verification steps for Android developers, or confusing feedback about incorrectly formulated permission scopes for iOS. Developers also face problems introduced by third-party libraries. Official documentation plays a key part in understanding privacy requirements, but in some cases, may itself cause confusion.

We discuss implications of our findings and propose ways to improve developers’ experience of working with health-related features—and consequently to improve the privacy of their apps’ end users.

Charting App Developers’ Journey Through Privacy Regulation Features in Ad Networks

Mobile apps enable ad networks to collect and track users. App developers are given “configurations” on these platforms to limit data collection and adhere to privacy regulations; however, the prevalence of apps that violate privacy regulations because of third parties, including ad networks, raises the question of how developers work through these configurations and how easy they are to use. We study the privacy-regulation interfaces of three widely used ad networks through two empirical studies, a systematic review and think-aloud sessions with eleven developers, to shed light on how ad networks present privacy regulations and how usable the provided configurations are for developers.

We find that information about privacy regulations is scattered across several pages, buried under multiple layers, and written in terms and language developers do not understand. While ad networks put the burden of complying with regulations on developers, our participants see ad networks as responsible for ensuring compliance. To assist developers in building regulation-compliant apps, we suggest dedicating a section to privacy, offering easily accessible configurations (at both the graphical and code levels), building testing systems for privacy regulations, and creating multimedia materials such as videos to promote privacy values in the ad networks’ documentation.

Understanding Privacy-Related Advice on Stack Overflow

Privacy tasks can be challenging for developers, prompting privacy frameworks and guidelines from the research community that are designed to assist developers in considering privacy features and applying privacy-enhancing technologies in the early stages of software development. However, how developers engage with privacy design strategies is not yet well understood. In this work, we look at the types of privacy-related advice developers give each other and how that advice maps to Hoepman’s privacy design strategies.

We qualitatively analyzed 119 privacy-related accepted answers on Stack Overflow from the past five years and extracted 148 pieces of advice from these answers. We find that the advice mostly concerns compliance with regulations and ensuring confidentiality, with a focus on the inform, hide, control, and minimize strategies from Hoepman’s framework. The other strategies (abstract, separate, enforce, and demonstrate) are rarely advised. Answers often include links to official documentation and online articles, highlighting the value of both official documentation and other informal materials such as blog posts. We make recommendations for promoting the understated strategies through tools and detail the importance of providing better developer support for handling third-party data practices.

Deciding on Personalized Ads: Nudging Developers About User Privacy

Mobile advertising networks present personalized advertisements to developers as a way to increase revenue. These ads use data about users to select potentially more relevant content, but the choice framing also impacts developers’ decisions, which in turn affect their users’ privacy. Currently, ad networks provide choices in developer-facing dashboards that control the types of information collected by the ad network as well as how users will be asked for consent. Framing and nudging have been shown to impact users’ choices about privacy, and we anticipate that they have a similar impact on choices made by developers. We conducted a survey-based online experiment with 400 participants with experience in mobile app development.

Across six conditions, we varied the choice framing of options around ad personalisation. Participants in the condition where the privacy consequences of ad personalisation are highlighted in the options are significantly (11.06 times) more likely to choose non-personalized ads than participants in the Control condition, which includes no information about privacy. Participants’ choices of an ad type are driven by the impact on revenue, user privacy, and relevance to users. Our findings suggest that developers are impacted by interfaces and need transparent options.

The Developer Factor in Software Privacy

Computer programs operate and control our personal devices, cars, and infrastructure. These programs are written by software developers who use tools, software development platforms, and online resources to build systems used by billions of people. As we move towards societies that rely on computer programs, the need for private and secure systems increases. Developers, the workforce behind the data economy, impact these systems’ privacy and, consequently, their users and society. Therefore, understanding the developer factor in software privacy provides invaluable input to software companies, regulators, and tool builders.

This thesis includes six research papers that look at the developer factor in software privacy. We find that developers impact software privacy and are also influenced by external entities such as tools, platforms, academia, and regulators. For example, changes in regulations create challenges and hurdles for developers, such as creating privacy policies, managing permissions, and keeping user data private and secure. Developers’ interactions with tools and software development platforms shape their understanding of what privacy means, such as consent and access control. The presentation of privacy information and options on platforms also heavily impacts developers’ decisions about their users’ privacy, and platforms may sometimes nudge developers into sharing more of their users’ data through (dark) design patterns.

Other places developers learn about privacy include universities, though they may not learn how to include privacy in software. Some organisations are making efforts to champion privacy as a concept inside development teams, and we find that this direction shows promise, as it gives developers direct access to a champion who cares about privacy. However, we also find that their organisation or the wider community may not always support these privacy champions. Privacy champions face an uphill battle to counter many of the same privacy misconceptions seen in the general population, such as the ‘I’ve got nothing to hide’ attitude.

Overall, I find that research in developer-centred privacy is improving and that many of the approaches tried show promise. However, future work is still needed to understand how to best present privacy concepts to developers in ways that support their existing workflows.


In my free time, I enjoy walks in nature, travelling to new places, watching movies, and hanging out with family and friends. I spent about four years working as a software engineer in industry before starting my research career (see my CV).

Informal chats: if you’re interested in my research and would like to chat, I’m happy to have a 20-minute video call about anything related to my work. I can often squeeze in a call within 5-7 days of your email. Topics can include, but are not limited to, what I do, future research avenues, possible collaborations, my experience in academia/industry, and doing a degree in my research area (I don’t offer any positions). I speak Farsi and English :)