
Workforce Development and Mobile Learning: Our Dakar Survey


Reports about global poverty often start with grim statistics about youth unemployment. While such statistics routinely fail to capture the mitigating influence of the informal economy, the fact remains that young people in developing countries struggle to find stable employment—let alone employment that actually interests them personally.

For decades, educational institutions have shown themselves rather unimaginative when it comes to workforce development and career education: Career Day, anyone? Most people learn about jobs from their friends, their family members and, if they’re lucky, from their employers. In developing countries, where many young people work (if at all) as petty merchants or manual laborers, the particular culture of the office workplace—as dominated by western-educated management level employees—can seem completely inscrutable, if not downright unwelcoming.

Code Innovation is committed to decoding the norms and expectations of the workplace for the young would-be professionals who currently fill the ranks of the world's unemployed. We are keen to leverage mobile technologies to help young people surmount the barriers to entry-level positions in organizations and enterprises that will allow them to grow and become more prosperous.

For years now, we've been thinking about and working on workforce development with at-risk and low-income youth. A few years ago, we started a youth workforce survey in and around Dakar, Senegal. We had the guidance of a Peace Corps Volunteer who was working with us for the year, and the almost full-time attention of our young Senegalese assistant. From the survey, we learned a great deal about youth, mobile education and workforce development and we are excited to inform our new projects with the perspective that these findings gave us (more on that in the future).

We found that doing firsthand market research in our urban African context provided rich data for decision-making around our innovation and education work. We hope that others will find this write-up of how we put our survey together useful in their own technology-for-education work.

About the Survey

We sought to interview 2,000 young people in and around Dakar about their use of technology in preparing for and finding a job. Because of confusion about interview responses that could've been solved with closer supervision, we ended up with 500 interviews in French from university and vocational students, out-of-work youth and entry-level professionals. We surveyed both men and women from around West Africa who are living, working or studying in and around the greater Dakar metropolitan area.

Our research assistant asked each survey respondent 35 questions. Some of the questions were open-ended, but the majority were yes/no questions. Because we hadn't codified "prefer not to answer" and "don't know/haven't thought about it" as distinct responses, our data wasn't as rich as we had wanted it to be.
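The lesson here generalizes: reserve explicit codes for non-responses before fieldwork starts, so a "no" and a "don't know" never collapse into the same value. Here is a minimal sketch of what such a codebook could look like (the codes and function names are our own illustration, not the survey's actual scheme):

```python
# Hypothetical codebook: non-responses get their own negative codes,
# so they can be excluded from yes/no rate calculations later.
CODEBOOK = {
    "yes": 1,
    "no": 0,
    "prefer_not_to_answer": -1,
    "dont_know": -2,
}

def code_response(raw: str) -> int:
    """Map a raw interview answer onto a numeric code."""
    key = raw.strip().lower().replace(" ", "_").replace("'", "")
    if key not in CODEBOOK:
        raise ValueError(f"uncodable response: {raw!r}")
    return CODEBOOK[key]

def valid_responses(codes):
    """Keep only substantive yes/no answers for analysis."""
    return [c for c in codes if c >= 0]
```

With this in place, a "yes" rate is computed only over `valid_responses`, while the share of "prefer not to answer" becomes a data point in its own right.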

All the same, we shared our data sets with our research partners at a business university in the northeastern United States, where PhD students are running independent analyses. We'll publish what we found in the future, but for now, we want to tell the story of the survey. We hope that others hear our story and decide to use this stakeholder analysis method too.

How We Did Our Youth Workforce Development and Mobile Learning Survey in Senegal:

1. We had a compelling reason for young people to participate.

When our research assistant was still very new to his job, he felt shy approaching and interviewing respondents because he wasn't telling them what we were doing or why it was important to them.

This is a classic case of the "features vs. benefits" sales mistake, and it goes something like this. Our research assistant would approach a young person in the late afternoon outside the university and tell them about our survey. "It's 35 questions and I'm from Code Innovation, a local tech company," he would say. "Can I interview you?" Most people would look at him blankly and, on learning that they weren't going to be compensated in any way, say no. This happened a lot and he began to get discouraged. Our assistant was focused on the "features" of the thing he was doing, in other words, what it was and how it worked.

When we began to work with him on the "why" of the story, people started to respond. By focusing on the benefits to them, people had a clear reason to say "yes" and get involved.

"Hi," our assistant would say, usually in Wolof, before asking the questions in French, "I'm working with a company that is building a free mobile app to help young people get a job. Would you answer some questions to help us with our project?"

The clearer our assistant was about communicating that benefit to respondents, the better things went.

2. What kind of data did we need to build our product?

Our idea for this survey was based on an observable and measurable need. As African economies and African cities grow over the next decades, young people need to know how to identify jobs that suit them and get the skills they need to negotiate their careers.

In our other start-up businesses, we'd seen the skill gap between where we wanted our new hires to be and where they actually were. The first few months of any new project would involve extensive step-by-step training and norming around professional and organizational culture. This isn't something that's taught in career workshops, university or secondary school, but as an employer in Africa, it's a big 'X' factor in hiring and building a team.

We wanted to solve for it with an app that taught young African professionals how to enter the workplace and negotiate their careers.

3. What were we trying to build in the first place?

When we started the survey in 2012, most of Code's experience was in computer and Internet projects around e-mentoring and e-learning with at-risk and low-income secondary students around the world. We did not have experience building mobile apps, but we saw their potential for our demographic of young, urban, educated Africans.

We wanted to get enough data to know the following:

1) How were young people already using technology in their searches and along their career paths?

2) Were their strategies working or not working in terms of moving them towards their career goals?

4. What did the survey look like?

You can see a copy of the survey we used here in English and here in French. We had our Peace Corps Volunteer with a background in social science develop the questions with us, and train our research assistant in his first round of interviews.

What We Learned from Doing the Survey on Youth and Mobile Workforce Development:

1. This was easier to do than we thought and provided a good way of getting data for decision-making before we developed our project.

2. We didn't need as many respondents as we thought. Even though we only surveyed a quarter of the people we initially thought we would, we still had more than enough data for our analysis. In retrospect, we could've stopped at around 100, as long as those respondents were exactly within the required demographic and gender-balanced.

3. Inviting U.S. research universities to do our data analysis took the work off our hands and made our analysis verifiably independent. Also, we like to think that it was interesting for the students involved to learn a bit about African research contexts. We found the partnership to be very rewarding and highly recommend that other teams like us reach out to and work more closely with universities.
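The sample-size takeaway (point 2 above) can be sanity-checked with the standard margin-of-error formula for a proportion. This is our own back-of-envelope illustration, not part of the original analysis:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a yes/no proportion p
    estimated from n respondents (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# n = 100 -> roughly +/- 9.8 percentage points
# n = 500 -> roughly +/- 4.4 percentage points
```

The margin shrinks only with the square root of the sample size, which is why roughly 100 well-targeted, demographically balanced respondents can already be enough for directional product decisions, while quadrupling the sample only halves the error.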

Thanks for reading this far! If you're interested in learning more about this, please feel free to get in touch (info@codeinnovation.com). We love to have conversations about technology for education and with others building in the African tech space.

Collaborating for monitoring and evaluation


Whenever I hear people lamenting the difficulty of monitoring and evaluating projects, and certainly whenever I encounter good projects with feeble monitoring efforts, I wonder why more people are not taking advantage of a widely available expert workforce that regularly takes on new projects without asking for any payment.

All around the world and especially in the “publish-or-perish” universe of North American academia, there are researchers (professors, associate professors and graduate students alike) who are scouring the earth for new things to research and write about. For many academics, the biggest challenge is figuring out how to gather the data about which they hope to write. For them, this process is expensive, time-consuming and often subject to protective legislation and intimidating review committees.

Meanwhile, we in the development community are usually able to survey our target audiences as often as we like and with relative ease. When we put together our project plans and proposals, nobody will question our decision to invest real money in collecting data about our programs. At the very least, we can organize the implementation of basic print surveys and qualitative interviews. Where the development community seems to run a little bit weak is in the design of these surveys and, even more so, in their analysis. This is precisely where academia excels.

For nearly six years now, I’ve enjoyed a collaboration with social scientists at a U.S. university that has added incredible value to the monitoring and evaluation that I bring to a variety of social and educational projects. My partners in this effort have been gaining access to valuable data that they are motivated to analyze and discuss, some of which has proven publishable in academic journals. We both get exactly what we need from one another and we are all paid for our work by other sources. This means that our partnership operates in an unusually money- and paperwork-free arena, which inoculates it against any number of potential stresses. Every time I approach them with a new potential project idea, they are interested in working on it. This still surprises me every time.

Here's what I suggest if you want to forge partnerships that improve your M&E:

1. When you are at the project design phase of a new initiative, consider the impacts that you are hoping to achieve. Then determine which academic discipline would have the greatest interest in your achievements. Create a contact list of academics who specialize in the area to which you will be making your contribution. In order to populate this list, I suggest you:

a. Prioritize academics at institutions that you and your team have attended or to which you and your project are affiliated. We all know the importance of networks.

b. Ensure that you approach some academics who are not already known and famous. If you identify some academics who would be interested in your project and you happen to know that they are already deeply involved in development work or with their successful careers in general, look at who else is in their department. Look for younger or associate professors with more to prove. Look into departments at universities that rival the university at which your famous academics are housed.

c. Consider reaching out to academics who come from one of the countries, or the region, you are working in. Homesick professors are often quite willing to help and require less background briefing because they already understand your context.

d. Social scientists, because of the general nature of their discipline, will often be your most appropriate points of contact, especially if you are trying to affect behavior change in a population.

2. Know what you want to prove before reaching out to potential academic partners. They will be able to help you a great deal with the practical matters of designing your M&E materials, but they will be more likely to have confidence in you and your project if you can articulate the clear objectives you want to prove. Don’t get into this level of detail in your first correspondence, but have it ready before you initiate the discussions.

3. Know what sort of data you will be able to provide them. Be ready to talk about how many people you will be able to survey and how often. Be mindful that your academic partners may need for you to cleanse your data of any identifying information before you share it with them—especially if you are working with young people. Showing an awareness that you might need to process your data for them will also help to build their confidence in your team.

4. Offer to involve your academic partners in drafting your surveys and preparing any interview questions. Be open to their suggestions and to the possibility that they may take your M&E in unforeseen directions that could improve the perspective and strength of your program. In general, treat them as partners, not as employees or contractors. At any point, they can drop your project cold.

5. Mix established metrics with new ones. Especially when you are dealing with social scientists, you may find that they suggest using already validated question sets to measure, for example, happiness or self-confidence. Take advantage of these established instruments even if they are not 100% relevant to your program. If they contain a solid portion of questions relevant to your program, it is worth including the entire set because of how much easier it will be for your academic partners to publish your results. Don’t be shy about trying to create new question sets and metrics of your own. This is an exercise that academics enjoy and a good team-building experience.

6. Be clear, from the beginning, about any potential sensitivities of your donor or organization. Protect and insulate your partners from involvement in internal political discussions that don't concern their M&E. Spell out any limitations on how you can each speak about or publicize your collaboration and avoid misunderstandings by anticipating them. No other party to this equation will be able to do this for you. Also, don't assume that they would like you to pass their contact information to all of your colleagues without checking first. Respecting each other's boundaries is critical.
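As a concrete illustration of the data-cleansing step in point 3, here is a minimal sketch of stripping direct identifiers and issuing a stable pseudonymous ID before sharing records externally. The field names and salt are hypothetical, not from any specific project:

```python
import hashlib

# Fields that directly identify a respondent and must never be shared.
IDENTIFYING_FIELDS = {"name", "phone", "email", "home_address"}

def anonymize(record: dict, salt: str = "project-secret") -> dict:
    """Return a copy of the record safe to share with academic partners:
    identifying fields are dropped and replaced with a stable hash-based
    pseudonym, so the same respondent keeps the same ID across surveys."""
    pseudo_id = hashlib.sha256(
        (salt + record.get("name", "") + record.get("phone", "")).encode()
    ).hexdigest()[:12]
    cleaned = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    cleaned["respondent_id"] = pseudo_id
    return cleaned
```

Because the pseudonym is derived deterministically from the identifying fields plus a secret salt, partners can link a respondent's answers across survey rounds without ever seeing a name or phone number.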

That’s it.

With some planning and focused outreach, your project's M&E can be robust and externally validated. With a little time, you may even have results published in a reputable journal that reflect the accomplishments of your team. Plus, if your project is absolutely not working, your academic partners are not going to sugarcoat it for you! Good times.