Feminist Law

Podcast Episode 10: Menstruapps and Data Protection with Dr. Daniela Alaatinoglu

CJ: Hello and welcome to the Feminist Law Podcast. I’m your co-host Courtney Jones.


CT: And I’m your co-host Clara Topiol. We’re both co-founders of the Feminist Law Project and final year law students who are very passionate about feminism and the law. Today on the podcast, we are very lucky to have Daniela Alaatinoglu, a post-doctoral researcher at the Faculty of Law of the University of Iceland and Senior Researcher at the University of Turku. Daniela, would you kindly introduce yourself?


DA: Thank you very much for having me. You already mentioned my name, so I think it’s enough to say that I am a researcher and a teacher in law who likes to think critically about how the law regulates our lives and also, or perhaps particularly, who it includes and excludes. My work is mainly within the fields of socio-legal studies; law, gender and sexuality; comparative public law; human rights; criminal law; and, to some extent, law and technology.


CJ: Thank you for that introduction. So, we’ve read your recent article titled ‘Rethinking explicit consent and intimate data: the case of menstruapps’. We found it really fascinating and eye-opening, especially in the digitalised era we currently live in. What made you decide to write this article, and why do you think it was important to have this information published?


DA: So, something that really opened my eyes to the subject is that, in general, the age we live in now is very much one of surveillance and the quantified self. We see how these technologies are constantly developing and increasingly collecting and sharing data about our lives, for example by measuring and evaluating our bodies, physical fitness and health. I think that menstruapps, or period-tracking software applications, are just one example of such technology. When it comes to these health apps, and menstruapps in particular, it is quite important to highlight that data is shared or processed both in forms that we quote unquote agree to – we will of course talk more about this agreement and consent – and also in forms that we have not agreed to, more specifically through leaked data. Both of those are ways in which our data is being shared with others. Then, I thought it’s quite an important social justice concern that in order to access our digital rights or online services, we have to quote unquote agree or consent; again, you don’t really have much of a choice if you want to be able to use something online. Something that made me really interested in menstruapps is that many of them, in their marketing campaigns, use very feminist or emancipatory language when talking about taking control over one’s body. They sit at an interesting intersection between this feminist language surrounding periods and sexual and reproductive health matters in general, and this constant neo-liberal search for new markets – an unsettling combination to witness as a feminist. Then, I noticed that there was some problematisation of the app/tech industry, health apps and other kinds of software apps. This problematisation was being done largely, I would say, from a human rights perspective. We have, for example, Coding Rights, who have a feminist focus as well; they wrote a report about this in 2016.
Other organisations that have problematised the app/tech industry are, for example, Privacy International, Amnesty International and the Norwegian Consumer Council, writing about how the apps are processing, sharing and leaking our data. When it comes to menstruapps in particular, I think it’s interesting because they gather data that is, in the perspective of many, very, very sensitive, and it’s not just about periods. They also ask users to fill in information about sexual intercourse, masturbation, orgasms, mood swings and many other categories. So, they gather data which is quite sensitive and, I think for many people, quite private. That of course carries an inherent data privacy concern. I noticed that whilst there were some organisations reporting on this, there was little scholarly debate about the gendered and intersectional implications that data privacy actually has as a fundamental right. Feminist theory, in turn, has of course written a lot about consent and critiqued the concept for quite a long time. Much of this discussion has been developed in relation to sexual violence, which is also a field that I have researched previously. I thought that this theoretical discussion could quite easily be applied to, or used in, discussions relating to consent and data privacy. This article and study were an effort to bridge a disciplinary gap that I saw between the discussion on data privacy and consent, critical analysis of menstrual health tracking, and feminist legal theory. As for why it was important for me to publish this work, I modestly hoped to develop this into a more scholarly discussion on feminist data ethics, and it was important to me also that it is available Open Access, so that it is accessible not just to people in wealthy higher education institutions but to a broader range of people.


CT: Thank you for that, that is absolutely fascinating. You mentioned that, for example, menstruapps or period-tracking apps collect a lot of data, some of which might not be for the specific purpose of period tracking. So, in terms of your research, what stood out for you as potential breaches of the GDPR or similar regulations by these apps?


DA: Yes, thank you. Something I should point out before answering this question is that the empirical study I conducted for this article was a quite small-scale and non-representative investigation of the formulations of consent in seven popular period-tracking apps. It was carried out in July 2020, so things might have changed since then – they probably have. So first, just to point out that it was small-scale and non-representative in that sense. Something that stood out to me was that all of the apps, of course, processed sensitive data, but not all of them even recognised this in their privacy policies. Something else that was interesting was that, according to their privacy policies, all of the apps legally based their processing of data on consent, but only four out of seven asked for users’ consent at the outset, that is, when you download and start using the app. So only four out of seven actually asked you to agree to their privacy policies; the others just assumed that you agreed. It was also not very clear in any of the apps what the data subject, the user, was consenting to at all. It was not really clear what data was being processed in the first place, nor what kind of data was shared with third parties. Many of these apps are of course free, but they make money through selling data. They share some data with third parties, but it was not really clear what kind of data was shared, nor who those third parties were. I recall one example of a description of the data being processed: it ‘may include usage details, metadata and real-time information about the location of your device’. So, first, we can see that this is quite technical language – ‘metadata’, for instance, because you need to know what that is, and so is ‘real-time information about the location of your device’.
‘Usage details’ doesn’t specify what kind of details, so these are quite vague formulations in these privacy policies – when the data was specified at all, and often it wasn’t. Then, something that was striking was that users were often only given two options: agree or not agree, yes or no. There was no real option to customise the privacy policy according to which kinds of data you felt comfortable sharing or not; it was a wholesale yes or no. It’s quite obvious that these kinds of conditions do not live up to the standards of explicit consent, or even standard consent, as laid down in the European Union General Data Protection Regulation (GDPR) which you mentioned. So, I think that was maybe quite striking as well – how obvious these violations were.


CJ: Yes, absolutely, and that’s really interesting. So, you mentioned that it’s not really clear how data is being used by these apps or how it’s being sent to third parties. Given that, how significant do you think the GDPR, or consumer data protection more generally, is with regards to menstruapps? And do you think it should change?


DA: I think that the GDPR has been symbolically important in many ways. I’m aware that, in a global perspective, the GDPR might, on paper, be quite a strong standard of data protection. I think it has been symbolically important in putting constraints on the app/tech industry, because it covers the whole European Union and not just one country. So yes, symbolically important. But when investigating the privacy policies of some menstruapps, I soon noticed that there was no real or meaningful engagement with the rules set down in the GDPR. There was maybe some quite shallow reference to the GDPR, but not any meaningful engagement with, or thinking about, concepts such as consent. There were also very limited possibilities for data subjects to decide which data they wanted to share, and if they did not want to share, it was not really clear whether they could even use the apps. Article 4 of the GDPR states that consent is a freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or a clear affirmative action, signifies agreement to the processing of personal data. This is quite a high standard, at least on paper, and when you compare it to the reality of the privacy policies of these apps, it was quite clear that the standard was not met. So ‘freely given, specific, informed and unambiguous indication of the data subject’s wishes’ sets out quite a high standard. Something that stood out to me, looking a bit closer at the GDPR, was that it distinguishes between standard consent on the one hand and explicit consent on the other. For special categories of data, such as health data and other sensitive data, data subjects are required to give explicit consent; that is regulated in Article 9, so it’s not just standard consent but explicit consent. What was surprising to me is that the GDPR does not really specify what this explicit consent standard entails.
So, it doesn’t specify its normative content. The European Data Protection Board has developed some guidelines on explicit consent, but they talk mainly about the forms of consent, and it was also clear that there might be some need to regulate this difference even further.


CT: Thank you for that, that’s really interesting. So, you’ve mentioned obviously that the GDPR has regulations in place to protect consumers’ rights but obviously there are cases with menstruapps where these rights are compromised. Could you talk a little bit more please about the impact of users’ data being compromised through these menstruapps?


DA: Yes, of course. I would really like to say a few words here about the assumed user and the intersectional implications that these apps might have. Something quite general was that the privacy policies of menstruapps were almost always in English, and they tend to use quite technical language. When you read the privacy policies, at least for me, it was quite clear that they assume a user who is fluent in English, probably a native speaker, who is very familiar with tech jargon and who also takes extraordinary measures if need be. One example of such extraordinary measures was that one or two apps asked data subjects, if there was some kind of data they did not want to share, to email the company – assuming that users would actually email the companies. So this is quite a high standard: the assumed user is, in the article I use the term ‘techno-global’, an informed consumer who knowledgeably navigates this field, whilst the reality might be very, very different. From an intersectional point of view, this is very interesting, and the field I am touching upon in this study – the different intersectional impacts that data protection might have – should be investigated even further. Some people who have limited access to professional sexual or reproductive healthcare services might, for example, be more likely to use these apps. One example here is users who wish to get pregnant: infertility treatment is of course very expensive and self-funded in many countries. That makes some people more vulnerable to data exploitation by these apps. Some users might also not be very fluent in English, or whatever language the privacy policy is written in – very often English, though. They might also not be familiar with tech jargon.
In the example I mentioned earlier, I’m not sure how many people know what metadata is. Then, of course, another important fact is that some countries have quite limited data protection laws and very little implementation or supervision of those laws, which makes people living in different countries differently vulnerable to this kind of data exploitation. I call this situated and embodied vulnerability to data exploitation ‘digital period poverty’ in the article, drawing on the existing concept of period poverty to think about what this kind of poverty could mean in a digital context, which I think is really important.


CJ: Yes, and I suppose as a follow-up, you talked a lot about the implications of these apps for women who are maybe tracking their fertility, trying to get pregnant, maybe wanting to access fertility treatments and become pregnant. Do you think these apps would have implications in countries where abortion is illegal, given that there is very limited privacy? Do you think this could pose a threat to women seeking abortions?


DA: Definitely. That’s actually something I have thought about also after writing this article. The privacy concerns are, in my view, just one part of the safety concerns when it comes to using these apps. In countries where safe abortions are criminalised or illegal, another safety concern is of course whether you might risk criminal sanctions if the app data is shared with law enforcement authorities. In the US recently, there was discussion about this topic in the aftermath of the Supreme Court decision in Dobbs v Jackson Women’s Health Organization (2022). I think that this is definitely another dimension of the safety concerns that these apps might raise.


CJ: Yes absolutely. It’s definitely concerning to think about how women can be put at risk because their privacy isn’t being protected by these apps. So, when users are using and installing these apps, they generally have to consent to some use of their data. What are we actually consenting to when we click that ‘agree’ button?

DA: Very good question. Based on this study, I think the best answer is really that we don’t know. After investigating the privacy policies of the different apps, it’s just quite unclear what we’re actually consenting to, how the data is being processed and shared, and with whom. There are better and worse apps in this context, with clearer and less clear privacy policies, but in many cases we do not really know, unfortunately.


CT: That is quite a terrifying thought, to be completely honest, given how many people rely on these apps, whether that is to track their menstrual cycle, ovulation windows, etc. So, in light of that, how would you recommend that users of these apps use them in a safer way – if there is a safer way at all?


DA: First, I’m quite aware, as you also mentioned, that these apps are quite popular and that some users really like the sense of control that getting information about one’s body and sexual and reproductive health might give you. I think, of course, that it is important to address the gender data gap that exists and to take sexual and reproductive health seriously; these are serious concerns. And of course, my purpose with this study and this article is not to say that these users are wrong or stupid. The privacy concerns are mainly about which companies get your information and target their ads based on your reproductive health situation. One important example of this, used many times, is when the app assumes or knows that the user is pregnant, either because the user specifically gave this information to the app or because the app infers it from the menstrual cycle and some irregularities. That user then starts getting ads for pregnancy- and baby-related products, because that is of course a huge market. That’s one privacy concern, but then, as we also mentioned, another element of this is of course contexts where safe abortions are criminalised or illegal, and the risks that might bring. I think that there might be some individual strategies, such as using safer apps; some apps have better privacy policies than others. But I do not want to individualise too much and leave all the responsibility to individual users or consumers, because some individuals will always be more aware and have better language skills and more legal and technical knowledge than others. The point of this study and this article is not so much whether menstruapps are bad or whether we should abandon them, but that we need better regulation and supervision of the app/tech industry, and we need regulation that takes into consideration the actual possibilities we have to understand, to negotiate, and to consent to sharing our intimate data.
Perhaps – and this could be something that goes beyond the study – the answer is simply to prohibit some kinds of data processing. My wish with this is to put the focus on the data controllers rather than on the data subjects, the users.


CJ: Yes absolutely, it’s really important that the responsibility doesn’t lie completely with each individual user because as you mentioned, that’s not a good method of protecting privacy when using these apps. If somebody is concerned about their privacy when using menstruapps, are there any alternatives that you would recommend to using a period-tracker app?


DA: I think if you really, really want to use an app, there are options which are designed with data privacy in mind. One example – and I’m not funded by them – that I found and that I think should be quite respectful of data privacy is Bloody Health. They are not creating revenue. Of course, period-tracking as such is not new; it didn’t come with information technology and is actually a practice that predates modernity, so it can simply be done with pen and paper if one wants to.


CT: Thank you for that. So, instead of banning menstruapps altogether, in your article you suggest reimagining consent in a way that takes power relations into consideration, and you build on the feminist concept of freedom to negotiate. Could you please expand on this?


DA: Yes. As you said, I use the concept of freedom to negotiate, a concept I have borrowed primarily as developed and used by Tanya Palmer, who has applied it in relation to sexual violence and rape. I then tried to develop it in relation to violations of data protection. It’s a concept that takes into consideration power, as you rightly mentioned, and also the context and the communication between the parties – in this case, between the data controller and the data subject, so the provider of the app and the user of the app. The main point, as I also said earlier, is that throughout this study I try to imagine how we can make legal standards more respectful of our wishes as data subjects. I think that the concept of freedom to negotiate allows for some more imagination. It’s more sensitive to the communication that happens between data controllers and data subjects – or, in many cases, the lack of communication – and to the way these apps are marketed to us. My wish is also to use freedom to negotiate to develop an understanding that could be more inclusive of the range of data subjects who access these apps in very different contexts and under very different circumstances – to be more sensitive to these different vulnerabilities and not always build on the techno-global user who is very aware and navigates this area with ease.


CJ: Yes, that’s really interesting. Thank you for sharing that. Before we conclude today’s episode, is there anything you’d like to share with our listeners?


DA: Yes, if you don’t mind, I’d actually like to read a small paragraph from the article in which I build on Iris Marion Young’s 2005 book ‘On Female Body Experience’. She was of course not writing about menstruapps at the time, but she discussed the paradox of emancipation through knowledge and control that is then used to suppress the female or gendered body, and I think this is quite evident when it comes to menstruapps. So I will read a paragraph, if that’s ok with you.


CJ: Yes, please do.


DA: Okay. ‘The oxymoron present in the promise of emancipation through detailed observation, knowledge and mastery of the reproductive and sexual body is an interesting trait of menstruapps. Through minute observation, tracking and reporting of the menstrual cycle, users ultimately gain emancipatory sexual and reproductive knowledge and control of their bodies. Simultaneously, the presumed purpose of such managerial skills is paradoxically imitating a non-bleeding, presumed male or invisibly gendered efficient norm, concealing the inefficient bleeding and visible body. This feeds into a general split subjectivity of people who menstruate, claiming normalcy on the one hand and fearing the private fluidity of the flesh on the other.’


CT: Thank you so much for sharing that with us and for coming on the podcast. I really hope this encourages some of our listeners to read your work. So, if some of our listeners did want to read your articles or look further into your research about menstruapps or generally, where can they do this?


DA: So, thank you for having me. This conversation was based, as I said at the beginning – but I can repeat it here – on my article ‘Rethinking explicit consent and intimate data: the case of menstruapps’, published in Feminist Legal Studies in 2022 and available Open Access. I recommend any interested listeners to check that out, and to check out the list of references, because there are many interesting sources in there. Another very important initiative that I would like to point interested listeners towards is a feminist coding collective based in Brazil called Coding Rights. They were very early to draw attention to problems inherent in menstruapps, and they don’t just write about menstruapps but also about feminist and intersectional approaches to technological developments, with a focus on social justice and data. I think that’s a very interesting place to look for more information on related topics.


CJ: Thank you for sharing those resources with us and letting our listeners know how to access your articles and some more feminist legal resources to read. We really appreciate you coming on the podcast today.


DA: Thank you, thank you for having me.


CJ: In today’s news roundup, a nurse formerly working at a CVS Health clinic in the United States has sued the company, accusing her employer of discriminating against her religious beliefs by firing her for refusing to prescribe contraceptives.


CT: The Metropolitan Police is investigating 1,000 domestic abuse claims involving around 800 officers. Amongst others, PC David Carrick has admitted to sexual offences against 12 women after using his role to instil fear in his victims. He has now admitted to 49 offences over two decades. As a result, the Home Office is asking all police forces to vet their existing staff.


CJ: Also in today’s news roundup, the UK government has decided to block a controversial Scottish Bill that would simplify the process of legally changing one’s gender, allegedly due to the conflict that would arise with equality protections in Great Britain if the Bill were passed. The Scottish government is expected to challenge the decision.


CT: Also, Iceland captain Sara Björk Gunnarsdóttir has won a claim against former side Lyon for failing to pay her salary during her pregnancy, in a landmark case. In May 2022, Lyon was ordered to pay unpaid salaries in excess of €80,000 (£72,000).


CJ: Finally, Andrew Tate has been detained in Romania on human trafficking charges and will remain there until at least February 27th after a Romanian court decided to extend his detention. He has also been found to have sent flirtatious messages over Instagram to teenage girls he had previously met. If you have any suggestions for this podcast, let us know directly via email at contact@feministlaw.org.

CT: Please also visit our website at feministlaw.org and follow us on Instagram and LinkedIn to keep up to date with our latest articles, podcasts, newsletters and exciting news.


CJ: The music from this podcast was sourced from Pixabay.com.


CT: Thanks for listening!


---


Transcribed by: Clara Topiol


