Barcelona
Spain
About B-MINCOME
‘B-MINCOME - Combining guaranteed minimum income and active social policies in deprived urban areas’ was a pilot project which aimed to fight poverty and social exclusion. It was implemented by the city of Barcelona and its partners between November 2016 and October 2019. After the project’s conclusion, efforts to collect and combine the results of all the conducted research activities were ongoing until the beginning of 2021.
B-MINCOME can be seen both as a public policy intervention and as an experiment testing a specific public policy. As an intervention, it tried to improve households' socioeconomic situations and increase their economic independence. This was expected to lead to positive effects in various dimensions of beneficiaries’ wellbeing. To achieve its policy objectives, the Barcelona City Council introduced the Guaranteed Minimum Income (GMI) to supplement the income of households living in the ten poorest areas of the city.
The GMI scheme was combined in various ways with four active policies devoted to:
- Training and employment;
- Social economy;
- Help with renting out rooms;
- Fostering community participation.
In some cases, receiving the GMI was conditional on participation in a specific policy. The GMI itself was divided into low, medium and high transfers, and the amount could change during the project.
Participation in the intervention was voluntary, but inclusion was based on eligibility criteria including prior engagement with social services in Barcelona, a means test and residency in one of the designated areas. Among the eligible beneficiaries, assignment to a particular GMI scheme was achieved through stratified randomisation.
Some eligible households were assigned to a control and reserve group (i.e. they did not receive the intervention). This is because, while being a public policy intervention, the project was also designed as a social experiment to test the hypotheses that partners had developed on potential results and impact. An elaborate evaluation design was developed and implemented to understand both whether and how the intervention worked.
“The most important issue about the B-MINCOME project is that really it was a project of evaluation of public policies […] It was not a project about innovation itself, but about evaluation of innovation taking several dimensions.” (Source: B-MINCOME project hearing)
This case study is part of a UIA report. You can access all of the project's resources on its project collection page.
Evaluation governance
The project was implemented by the Barcelona City Council together with five delivery partners. Because of the project’s nature as both a public policy intervention and evaluation, almost all partners were involved in evaluation to some extent:
- The Catalan Institute of Public Policy Evaluation (Ivàlua) conducted:
  - the quantitative impact evaluation;
  - the quantitative economic evaluation.
- The Young Foundation carried out the ethnographic study.
- The Institute of Governance and Public Policies of the Autonomous University of Barcelona (IGOP) conducted:
  - the analysis of the deployment and effects of the community participation policy;
  - the study on the implementation process and governance of the project.
- The Institute of Environmental Science and Technology of the Autonomous University of Barcelona (ICTA) was involved in evaluating the determinants of subjective wellbeing among participants.
- The Institut Internacional per l'Acció Noviolenta (Novact) was in charge of carrying out the evaluation of the implementation of the Real Economy Currency (REC).
The project partners represented different professions and areas of expertise, which provided access to diverse knowledge and enabled synergies.
“I think that the mix between university or researchers from the sociological field et cetera mixed with civil servants, with people in the field, collaborating, it was very interesting because the knowledge that we have developed is very different than if we have put the focus only in one part – the academic part or the policy part.”
(Source: B-MINCOME project hearing)
In particular, B-MINCOME benefited from strong expertise in research methodologies. The project’s access to extensive quantitative expertise allowed it to function both as a public policy intervention and an experiment, enabling strong causal inferences in relation to the intervention’s effectiveness and impact. The B-MINCOME evaluation also represented a complex effort to involve actors beyond project partners (see more under ‘Horizontal issues’).
Co-creation and close cooperation between all project partners were key for the evaluation. The dual nature of the project made it particularly important to find a common design and a mutual understanding from the start. Since both public policies and experiments face important limitations, the partners needed to be aware of and account for them during design and implementation. As the evaluators noted, the partners had to compromise on various ideas (such as the eligibility criteria, which were mandatory for the implementation of the public policy and inevitably shaped the respondent sample for the experiment) in order to reach their end goal and to be at least comfortable with what they were trying to achieve.
“So, from the very start of the project, there were these challenges and they had to be solved through co-creation […] Because, of course, we will only be able to evaluate whatever the City Council will design. And to evaluate in a rigorous way, it should be co-created such that they took into consideration the evaluation point of view and we took into consideration what they wanted to do. We didn’t want to answer questions that were not being asked in the first place." (Source: B-MINCOME project hearing)
B-MINCOME secured substantial resources within the project budget for evaluation, although evaluators noted during the hearing that resources still posed some challenges. The counterfactual experimental design chosen as one of the approaches involved a large research effort. Over 1,400 households were surveyed repeatedly (approximately 1,000 treatment households and over 400 control households, in three waves). While counterfactual evaluation is often more costly than other approaches, in B-MINCOME it was further supplemented with an ethnographic study and other qualitative evaluations pertaining to different elements of the intervention. Additional funds were also mobilised for integrating all research results. These came from the municipal budget independently of UIA financing.
Evaluation process
As a social experiment, B-MINCOME represents a comprehensive effort to evaluate the intervention as a whole and some of its specific parts (e.g. community participation policies, governance or the REC). Major emphasis was placed on evaluating the impact of the intervention at three levels – individual, community and institutional. Components were also devoted to evaluating efficiency (i.e. economic evaluation) and processes (i.e. governance and implementation evaluation). The table below gathers all the elements of the B-MINCOME evaluation.
|  | Quantitative analysis | Qualitative analysis |
| --- | --- | --- |
| INDIVIDUAL impacts | Ivàlua (Impact evaluation); ICTA (Impact on life satisfaction); IGOP (PA4 evaluation) | IGOP (PA4 evaluation); The Young Foundation (Ethnographic study); Novact (REC impacts) |
| COMMUNITY impacts | Ivàlua (Impact evaluation); ICTA (Impact on life satisfaction); IGOP (PA4 evaluation); Novact (REC impacts) | IGOP (PA4 evaluation and governance and implementation evaluation); The Young Foundation (Ethnographic study) |
| INSTITUTIONAL impacts | Ivàlua (Economic evaluation); IGOP (PA4 evaluation); Novact (REC impacts) | IGOP (PA4 evaluation and governance and implementation evaluation) |
Source: IGOP, excerpt from report on the final results (2017–2019), ‘Integration of evaluation results’, shared in October 2020.
While parts of the design were included in the project application presented to UIA, the development of the final approach took a substantial amount of time. The intervention therefore significantly benefited from the addition of a preparatory phase to all UIA projects starting from Call 2.
“Of course, after the submission was sent and approved, some of the nitty gritty of the evaluation design was still to be defined. So this was basically, let’s say, 7 months of intense work and negotiations between partners to reach a system that would answer the questions that were of interest to the project.”
(Source: B-MINCOME project hearing)
Specifically for impact and economic evaluation, Ivàlua employed the counterfactual approach (i.e. measurement relative to what would have happened had the project never taken place). Towards this end, Ivàlua divided households eligible for participation in the project into two groups – those who received the intervention in various forms (the treatment group) and those who did not (the control group).
The partners chose this approach as it corresponded with the circumstances of the project. The demand for services exceeded the funds available for GMI schemes.
“We had more eligible and actually solicitant families than the amount we actually could transfer. This created a natural excess of demand […] We thought the most natural way to answer this question given the context of excessive demand was using random allocation of the treatment.”
(Source: B-MINCOME project hearing)
The counterfactual method used in B-MINCOME was based on an experimental design: eligible households were randomly assigned – by way of a lottery – either to the treatment or the control group. The randomisation was stratified, with eligible participants divided into specific categories (strata) and randomly assigned within those categories. The blocking variables for the randomisation were a) eligibility for the policy promoting the renting out of rooms, b) employability of at least one household member (yes/no) and c) the expected amount of the monthly GMI benefit for the household (high, medium or low).
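As a purely illustrative sketch (not the project's actual code), the snippet below shows how a blocked lottery of this kind can be implemented; the data frame, the 65% treatment share and all column names are hypothetical assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2017)  # fixed seed so the lottery can be audited and reproduced

# Hypothetical list of eligible households with the three blocking variables named above.
households = pd.DataFrame({
    "household_id": range(1, 1528),
    "rent_room_eligible": rng.choice([True, False], size=1527),
    "employable_member": rng.choice([True, False], size=1527),
    "expected_gmi": rng.choice(["low", "medium", "high"], size=1527),
})

def assign_within_stratum(stratum: pd.DataFrame, treat_share: float = 0.65) -> pd.DataFrame:
    """Shuffle one stratum and split it into treatment and control/reserve groups."""
    shuffled = stratum.sample(frac=1, random_state=rng)   # random order within the stratum
    n_treat = int(round(len(shuffled) * treat_share))     # hypothetical share; set by budget in practice
    shuffled["group"] = ["treatment"] * n_treat + ["control_or_reserve"] * (len(shuffled) - n_treat)
    return shuffled

strata = ["rent_room_eligible", "employable_member", "expected_gmi"]
assigned = households.groupby(strata, group_keys=False).apply(assign_within_stratum)
print(assigned["group"].value_counts())
```

In the actual project, the split between treatment, control and reserve groups, and the size of each stratum, followed the eligibility audit and budget constraints described elsewhere in this case study.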
For the economic evaluation, Ivàlua employed, among other methods, cost-effectiveness analysis, focusing on the programme's impact on selected outcomes (general satisfaction, social support, health status, employment situation and education) relative to the resources used.
For the ICTA study evaluating the determinants of subjective wellbeing among the participants, cross-sectional regressions were conducted. ICTA also carried out a study of panel data which explored individual and collective changes in life satisfaction over time and sought to explain the variations in the levels of life satisfaction of each individual.
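To make the quantitative analysis more concrete, the following sketch shows what a simple cross-sectional regression of life satisfaction on treatment status and a few covariates could look like; the file name, variable names and model specification are assumptions for illustration and are not taken from the ICTA study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey extract: one row per respondent per survey wave.
df = pd.read_csv("bmincome_survey.csv")  # assumed file name

# Cross-sectional model for a single wave: life satisfaction explained by treatment
# status, household income and basic socio-demographic controls.
model = smf.ols(
    "life_satisfaction ~ treated + household_income + age + household_size + C(neighbourhood)",
    data=df[df["wave"] == 1],
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(model.summary())
```

A panel specification over the three survey waves would add individual fixed effects (for example with `linearmodels.PanelOLS`) to capture within-person changes in life satisfaction over time; the sketch above covers only the cross-sectional case.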
As part of the qualitative research effort, the Young Foundation conducted an ethnographic study, which also helped to determine what should be measured in quantitative surveys.
“Separating the ethnographic research from the quantitative analysis was important for us because this could inform part of the survey. So, what should we ask? What was not necessary to ask? What were the common problems for these families?”
(Source: B-MINCOME project hearing)
The ethnographic study consisted of three components:
- A study of the experiences of a select number of households over 12 months when beneficiaries received GMI. The report from the study was published in February 2020.
- A community-level study exploring the dynamics of poverty which resulted in a separate report.
- A study on the perceptions of individuals experiencing poverty. The result of this was 8 videos presenting beneficiaries’ stories of change, struggle and resilience, and how B-MINCOME had affected their lives. A film screening was hosted in the neighbourhood of Bon Pastor in November 2019 and the films are also available online.
Counterfactual approach
The counterfactual is a hypothetical situation describing “what would have happened had the project never taken place or what otherwise would have been true. For example, if a recent graduate of a labor training program becomes employed, is it a direct result of the program or would that individual have found work anyway? To determine the counterfactual, it is necessary to net out the effect of the interventions from other factors—a somewhat complex task.”
Source: Baker, J.L., Evaluating the Impact of Development Projects on Poverty. A Handbook for Practitioners, The World Bank, 2001.
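Under random assignment, the unobservable counterfactual can be approximated by the control group, so the average effect of the intervention reduces to a comparison of group means. In generic notation (an illustration, not a formula reproduced from the project reports):

$$\widehat{\mathrm{ATE}} = \bar{Y}_{\text{treatment}} - \bar{Y}_{\text{control}}$$

where $\bar{Y}$ is the average post-intervention value of an outcome of interest (for example, a material deprivation score). Randomisation ensures that, in expectation, the two groups differ only in their exposure to the GMI and the active policies, so this difference can be attributed to the intervention.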
Cost-effectiveness analysis
“Cost-effectiveness analysis (CEA) is a method that can help to ensure efficient use of investment resources in sectors where benefits are difficult to value. (…) CEA can identify the alternative that, for a given output level, minimises the actual value of costs, or, alternatively, for a given cost, maximises the output level. (…) CEA is used when measurement of benefits in monetary terms is impossible, or the information required is difficult to determine or in any other case when any attempt to make a precise monetary measurement of benefits would be tricky or open to considerable dispute.”
Source: European Commission, Evalsed Sourcebook – Method and techniques, 2013, p. 29.
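In its simplest form, the comparison underlying a cost-effectiveness analysis can be written as an incremental cost-effectiveness ratio; the notation below is a generic illustration rather than a formula taken from the Ivàlua reports:

$$\mathrm{ICER} = \frac{C_{\text{treatment}} - C_{\text{control}}}{E_{\text{treatment}} - E_{\text{control}}}$$

where $C$ denotes the cost per household of each scenario and $E$ the corresponding level of the outcome of interest (for example, the share of respondents reporting good health). A lower ratio means that a given improvement in the outcome is obtained at a lower additional cost.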
The B-MINCOME evaluation employed a mixed-methods approach to data collection and analysis. This offered a combination of strengths from different methods and allowed for mitigation of their shortcomings.
“In some kind of projects, we only focus on qualitative results, but we have not the power of big numbers [...] But, if we go to the other extreme and we have only the quantitative, sometimes it’s very difficult to explain why, what is the reason that explains why these changes have been produced in the people. And you need to search for the explanation [by] asking the people.” (Source: B-MINCOME project hearing)
The evaluation of B-MINCOME involved extensive data collection. From the start, the partners wanted to ensure the availability of appropriate data. The table below summarises all the data collection activities undertaken.
[Table not reproduced: it listed the quantitative and qualitative data collection activities carried out by the partners, including IGOP, The Young Foundation and Novact.]
Data collection methods in B-MINCOME.
Source: IGOP, excerpt from the report on the final results (2017–2019), “Integration of evaluation results”, shared in October 2020.
The project was a public policy intervention and a social experiment at the same time. This meant that the households selected to participate in the project would also be participants in the three waves of the survey which fed information into the impact and economic evaluation, as well as the evaluation of the determinants of subjective wellbeing. In other words, the sampling of beneficiaries was also the sampling for the survey. The control group was surveyed as well in all waves. The potential beneficiary sample was determined based on social assistance administrative records. One of the limitations of administrative data is that it lags behind reality and may, consequently, contain outdated entries. There was, therefore, a need for further verification of beneficiaries.
“We were using administrative data to identify families. The problem with administrative data is that sometimes they have a lag between what you know from the data and what is actually happening with the families. We were identifying families as maybe vulnerable that were no longer vulnerable.” (Source: B-MINCOME project hearing)
The City Council identified all the families that were active users of social services in the 10 selected neighbourhoods. The initial list included some 4,305 households, which were contacted by letter. Subsequently, up to 400 information sessions were held to explain the project, and 2,525 households eventually applied.
After an in-depth audit, 1,527 households were found to be eligible to participate in B-MINCOME. For this group, a lottery was held to assign participants to the treatment, control and reserve groups. The reserve group was created to make sure that the project could react quickly to non-participation or withdrawal of beneficiaries from the treatment group.
“Basically, because it is a public project you cannot allocate resources to people that are not eligible […] So, once you allocate families to the treatment group, you will always have a part of these families that will not end up participating […] From the control group you need to set up a reserve group such that you will take from this group and substitute those that will not participate in the treatment group.” (Source: B-MINCOME project hearing)
For the baseline measurement, a computer-assisted telephone survey was conducted, achieving an 87% response rate. The first follow-up survey had a similar structure to the baseline, with selected questions modified or added. Some surveys were conducted in person to overcome language difficulties experienced by respondents, and some families that could not be reached by phone were interviewed in person when they visited social services. The response rate in the first follow-up survey was 79.49%. The second follow-up survey, with a response rate of 75.72%, was again similar in structure to the first, with some questions added or eliminated. Surveys were completed by the person receiving the GMI and active policies, who responded to questions both about their own circumstances and about their household's situation.
Additional surveys were conducted by IGOP for the purpose of evaluating the community participation policies. In implementing the surveys, researchers encountered challenges related to the respondents’ levels of education and, as with the impact evaluation surveys, language. They were, however, able to adapt to the situation.
"We combined a survey that was distributed in hand, which means we went to the different sessions that took place with the families and we shared with them papers. We explained to them the questions because we learnt that just giving them the paper would be too complicated because some of them had troubles with reading and interpreting the questions […] We also realised that we would need to translate the surveys into Urdu.” (Source: B-MINCOME project hearing)
As is visible in the table, this substantial quantitative effort was complemented by qualitative research involving many data collection methods, such as in-depth individual interviews and focus groups. The Young Foundation employed ethnography, adding household observations to in-depth interviews. Importantly, the research had a longitudinal nature, with some households being visited more than once.
With the evaluation involving so many components and conducted throughout the entire project, integration of results became crucial – even more so if the shortcomings of the quantitative research were indeed to be mitigated by results from qualitative data collection and analysis, and vice versa. It is, therefore, a positive sign that the integration effort was carried out by IGOP in 2020, after the project had formally ended.
Mixed-methods approach
Mixed-methods approach can be defined as “research in which the investigator collects and analyzes data, integrates the findings, and draws inferences using both qualitative and quantitative approaches or methods in a single study or a program of inquiry”.
Source: Tashakkori, A., Creswell, J.W., “Editorial: the new era of mixed methods”, J Mixed Methods Res 1: 3–7, 2007.
Baseline measurement
Baseline measurement describes the situation before a given intervention (e.g. a project or programme) begins. Baseline data shows values for indicators selected to measure performance, outcomes and impact of an intervention prior to its initiation. This data can be compared to data gathered throughout the intervention and after its completion to estimate change.
Source: Bamberger, M., Reconstructing Baseline Data for Impact Evaluation and Results Measurement, The World Bank, 2010.
Longitudinal research
“Longitudinal research refers to the analysis of data collected at multiple points in time. (…) in research that uses a longitudinal design a single group of participants is followed and assessed at multiple points of time.”
Source: McKinlay A., “Longitudinal Research” [in:] Goldstein S., Naglieri J.A. (eds) Encyclopedia of Child Behavior and Development, Springer, 2011.
Horizontal issues
The B-MINCOME evaluation involved various stakeholders. Most project partners were included as evaluators in the evaluation design and implementation. Other partners and individuals implementing the project were consulted as sources of information; for example, IGOP organised focus groups with professionals representing the four active policies and with social workers from B-MINCOME. Through in-depth individual interviews, the evaluation sought to incorporate the perspective of other stakeholders, including administration professionals linked to the B-MINCOME project and representatives from the social communities of the Besòs Axis neighbourhoods.
The evaluation also involved the project’s target group – mostly as a source of information, however. Beneficiaries’ participation was crucial for the experiment to produce results, so they were widely consulted through quantitative surveys and qualitative interviews. Ensuring wider beneficiary response to the survey required strong communication, encouragement and support from researchers. Apart from engaging beneficiaries as sources of information, the ethnographic study allowed them to play a more active role in analysis and in assessment of their own experiences. This was also possible through the co-creation of videos summarising beneficiaries’ experiences of poverty and the significant changes they experienced as a result of the B-MINCOME project.
The beneficiaries were not involved in the co-creation of the evaluation due to a number of challenges. Their vulnerabilities, their limited experience of participating in research, and language barriers – coupled with a short project timeframe – reduced, in the partners' view, the avenues for co-creating the evaluation.
“Your question about how or to what extent the evaluation was co-created – in this context, it was challenging because the families were from the most vulnerable groups. There were maybe issues regarding language. There were people who had not been involved in any participatory processes or in the community.” (Source: B-MINCOME project hearing)
One of the issues which emerged in B-MINCOME was how to avoid placing too much of a research burden on project beneficiaries. During the evaluation, beneficiaries were asked to fill out repeated surveys, participate in interviews and accept being observed. This could have created an imbalance in the relationship between them and the researchers, leaving beneficiaries feeling that they were merely objects of the project partners' various actions.
“So, we had this concern how to approach them and not look like they were in a laboratory and we were just watching them. But it was also a challenge to design something that would not require too much commitment.” (Source: B-MINCOME project hearing)
The researchers were aware of this danger and introduced mitigating strategies which involved increased researcher participation in activities with beneficiaries in the initial months of project implementation, coupled with extensive communication to justify the research activities.
“Just being there, looking at what was going on, taking notes and that’s all, but at least they saw us not only as researchers that were in the laboratory, but as if we were persons, that we were following up what was going on there. We also tried to explain that we needed their input.” (Source: B-MINCOME project hearing)
Lessons learnt
The project clearly represents a very strong evaluation design which aims to provide a comprehensive picture of the intervention, both in terms of its impact – at the individual, community and institutional levels – and its processes. It goes far beyond monitoring. The evaluation made use of the counterfactual approach and combined it with qualitative studies, including ethnography. A wide range of stakeholders were consulted using various data collection methods, and different types of analysis were applied to the data obtained. Many partners collaborated on the evaluation not only to determine whether the action worked, but also to answer questions about how and why.
Some observations can be made in relation to transferability of the evaluation design. The availability of adequate research expertise would be an important consideration in determining whether the approach can be followed elsewhere. For ethnographic research, one of the main considerations would be the language of communication, especially for interventions involving migrants and vulnerable populations with lower education levels. Different linguistic competencies may therefore be needed in the research team. With proper qualitative expertise, however, ethnographic research as implemented in the project can be transferred to many other types of interventions.
The transferability of the experimental design seems to be a more complicated question, however. The feasibility of an experiment relies on the availability of a control group that does not receive the intervention’s support. The creation of a control group may be controversial from an ethical or political standpoint, but it may just as well – in some circumstances – be impossible. Further still, the counterfactual approach requires repeated data collection – mostly quantitative and usually wider than in qualitative studies – which creates the need for higher budgets. In this sense, the full design may not be transferable in the case of all interventions.
The research approaches and methodologies that the partners used drew on a well-known research toolbox. The innovation the project offers therefore lies in combining and linking all these different elements and applying the integrated result to an intervention such as B-MINCOME.
“If we define innovative in the sense that it was new, never done before – obviously not, because we were taking from other experiences, evaluation and monitoring. […] But […] most of the projects do not have a control group or lack data. And I think that this project was innovative in the sense that it took a lot of effort to guarantee that that was there, and that if we want to evaluate the effect of the project, we will have the data.” (Source: B-MINCOME project hearing)
A number of specific lessons can be drawn from B-MINCOME for future projects:
- The counterfactual approach can work in projects tackling sensitive social issues, if it corresponds with the conditions that the project creates, is well-integrated into an intervention from the start and is well-communicated. Interventions that concern sensitive social issues, especially those involving vulnerable populations, often cannot be evaluated through a counterfactual. However, as the B-MINCOME example shows, the approach can be used – for example, when the project meets a situation of excessive demand which would have created a sort of control group anyway. It also helps when the partners select the approach early on and can take several months to fully develop the evaluation before implementation starts. Such a preparatory phase allows the design to come to fruition through a cooperative process and helps to align the intervention with research requirements (and vice versa).
“What you think about at the beginning, from the top and far away is not really the same when you are there, meeting the people themselves.” (Source: B-MINCOME project hearing)
- Tensions may occur between policy implementation and research, but these should be worked out rather than feared. The project’s dual nature created some challenges (related to the official eligibility criteria influencing the sample and the need to ensure consistent participation and high response rates, for example). In both cases, internal and external communication and cooperation proved to be effective in leading partners to working solutions. The preparatory phase allowed the partners to work out the most important project details from the start.
“All the time, we faced the tension between the research and the evaluation, and the structural policies, infrastructure and human resources. All the time, we were managing as we could this tension between the structure, and the innovative and evaluation dimension.” (Source: B-MINCOME project hearing)
- The capacity of the target group (i.e. survey and interview respondents) should be analysed and solutions should be developed to widen participation. In interventions which tackle sensitive social issues, such as B-MINCOME, the evaluators should have enough understanding of the target group to be able to set up an evaluation design that also allows vulnerable individuals to participate. The challenges could include low literacy levels among the target group or language barriers. In these cases, more resources can be allocated (for the facilitation, interpretation, and translation of surveys, for example). Different modes of survey implementation can also be employed.
- Big research efforts run the risk of overburdening respondents with demands. Given the intensity of research, evaluations such as B-MINCOME should have mechanisms which force evaluators to examine their own demands towards research subjects. There should be moments for reflection on whether the respondents are asked for too much, and whether they are treated with respect and not just as sources of data. The researchers should build trust through their own participation in activities, putting themselves forward and providing the rationale for their own actions.
- There is great value in involving specialised research institutions with both qualitative and quantitative expertise. Thanks to the advanced expertise on board with the project, B-MINCOME was able to carry out extensive data collection and advanced analyses within the challenging context of a big and innovative project.
About this resource
The Urban Innovative Actions (UIA) is a European Union initiative that provided funding to urban areas across Europe to test new and unproven solutions to urban challenges. The initiative had a total ERDF budget of €372 million for 2014-2020.