Mentorship
Founding a grassroots design mentorship program at my workplace
Overview
The goal is simple: empower designers to do their best work by cultivating a culture of connection and learning.
I firmly believe in the power of mentorship to foster meaningful connections, growth, and innovation. Wanting to cultivate this culture of learning at CNN, I spearheaded a mentorship program comprising one-on-one matching and learning labs (monthly knowledge-sharing workshops). Since its launch, the program has hosted dozens of learning labs and connected dozens of designers with mentors, with participants rating their progress toward goals at 4.29/5 and overall satisfaction at 4.43/5.
Here's how I made it happen.
Step 0: Why mentorship?
Joining CNN Design right out of college, I vividly remember learning something new each time I poked around colleagues’ design files or shared in design critiques. As the youngest person on the team, I felt that there was something to learn from everyone. However, I noticed there was no established mechanism for gaining mentorship; for those wanting to mentor others, there were no shared resources to lean on. Inspired by the team’s wealth of knowledge, I felt I had the most to give to this effort.
Why I started a mentorship program 💡
To cultivate a culture of learning 🪴
To give back to senior colleagues who graciously took me under their wing 🪽
Step 1: Understanding the problem space
There were a few opportunities unique to my design org that I wanted to account for.
The design team at CNN is well-rounded across various domains, including animation, strategy, management, and content design, with further diversity across platform specialties such as CTV, web, and app; there is something to learn from everyone.
As a remote-first org with teams spread across the country, there is potential for isolation, making building meaningful connections even more valuable; having support feels good.
In light of the Warner Bros. Discovery merger and subsequent organizational changes, mentorship is a powerful tool for navigating this new landscape and facilitating knowledge transfer.
I shared my vision with my friend, Senior Staff Growth Product Designer Ron Rundus, who was excited to tackle this challenge with me.
How might we establish a grassroots mentorship program?
We began by gauging team interest and leadership support while clarifying the problems we wanted to solve.
Team interest: Determined widespread team interest through an internal survey and chats with colleagues.
Leadership support: Gained buy-in and commitment of resources by presenting a written proposal to design leadership.
Problem space: Gathered insights into best practices, challenges, and successful strategies by chatting with facilitators of other mentorship programs at WBD, as well as HR.
Initial interest survey
Brainstorming — What is mentorship?
Brainstorming — What I'm looking for
Step 2: What is mentorship? What is it not?
Having received widespread initial interest in this program, we set out to define our goals and principles. Our survey surfaced a range of needs and preferences, so clarity was essential.
What do we want to accomplish for our team? For the participants? How will we know if this program is effective?
Mentorship can mean many things; its definition is often conflated with coaching, teaching, and managing. Since we were starting from scratch, we had the opportunity to define what mentorship meant for our program based on the team’s needs. The highest-ranked topics of interest for both mentees and mentors were techniques and career advancement. Knowledge sharing, skill development, and career guidance are all valuable aspects of mentorship; how do they coexist?
How might we establish a grassroots mentorship program that facilitates knowledge sharing, skill development, and career guidance?
Most team members were interested in being a mentor, and nearly half wanted to be both a mentor and a mentee. This surfaced another factor that would affect our choice of structure: the org had a high level of seniority and expertise, so for peer-to-peer pairings, the managing aspect of mentorship would not make sense.
We also asked how we could help participants feel more supported. Many suggested having clear goals and timelines, as well as resources like leveling guidance, conversation starters, and guides for mentors.
"From previous experience, mentorship programs with clear goals and outcomes have been more successful" — Colleague, Interest survey
Given the large appetite for knowledge sharing and skill development, we decided to explore splitting the mentorship program into two formats: learning labs and one-on-one matching. Learning labs would be group sessions focused on hard skills such as animation and A/B testing. This part would launch first; as we trialed its format, we’d have more time to plan and structure one-on-one matching, the personalized aspect of the program. We envisioned the two parts enriching each other: for example, a designer interested in sharpening hard skills could start with an animation learning lab and follow it with personalized mentoring, while a designer interested in becoming more influential among cross-functional partners could sit in on learning labs while receiving tailored career guidance from a mentor.
Step 3: Planning one-on-one matching
I hosted a brainstorming session to flesh out the specifics of matching, scheduling, resources, success, and concerns with Ron and Robert Hoekman. Robert, a design director, was enthusiastic about the initiative and volunteered to support us. I created a FigJam file with boards for each of the listed topics, each with an overarching question and several logistical questions to discuss.
Drafting our guiding principles
Mentorship aligned with the “Stronger Together” tenet of the CNN Digital Cultural Behaviors: “We aspire to inform, engage, and empower each other because the work we do matters.” We added the following to this overarching goal.
Our guiding principles
Seek knowledge, share knowledge
Provide structure and support via thoughtful matchmaking, documentation, and guidance on best practices
Avoid one-size-fits-all. Enable different working styles and preferences
Keep time expectations realistic: ~1–2 hours per week
Measure and improve through quarterly reviews and surveys
Matching
For matching, we created a mentor intake survey, a mentee intake survey, and a matching guide detailing how to weigh responses to each question.
Matching process
Intake surveys
Matching guide
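The weighted-rubric approach behind the matching guide can be sketched in a few lines. Everything below is a hypothetical illustration: the survey fields, weights, and greedy assignment are stand-ins for the actual guide, which also covered anonymizing responses and third-party checks.

```python
from itertools import product

# Hypothetical intake-survey responses; the real surveys and weights differ.
mentees = {
    "A": {"topics": {"animation", "career pathing"}, "style": "weekly"},
    "B": {"topics": {"A/B testing"}, "style": "biweekly"},
}
mentors = {
    "X": {"topics": {"animation", "storytelling"}, "style": "weekly"},
    "Y": {"topics": {"A/B testing", "career pathing"}, "style": "weekly"},
}

def score(mentee, mentor):
    """Weight shared topics of interest most heavily; meeting-style fit breaks ties."""
    topic_overlap = len(mentee["topics"] & mentor["topics"])
    style_match = 1 if mentee["style"] == mentor["style"] else 0
    return 3 * topic_overlap + style_match

# Greedy assignment: take the highest-scoring available pair each round.
pairs = sorted(product(mentees, mentors),
               key=lambda p: score(mentees[p[0]], mentors[p[1]]),
               reverse=True)
matches, used_mentees, used_mentors = {}, set(), set()
for me, mo in pairs:
    if me not in used_mentees and mo not in used_mentors:
        matches[me] = mo
        used_mentees.add(me)
        used_mentors.add(mo)

print(matches)  # → {'A': 'X', 'B': 'Y'}
```

A greedy pass like this keeps the rubric simple and auditable for a small pilot cohort; a larger program could swap in an optimal assignment method without changing the scoring rubric.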
Scheduling
We created a scheduling guide for mentees and mentors to lean on from the get-go. It provided a few suggestions on how often and how long a pair should meet (we suggested 30 minutes per week), but the actual scheduling was left up to the pair themselves.
We’d release pairings all at once and run the pilot program for a quarter. Our colleague, Sabine, reminded us of the importance of psychological safety in this program. Psychological safety can be fostered by clear expectations, support, and open communication. A helpful mechanism would be a built-in exit strategy. We chose to run matches for a quarter because it provided sufficient time to grow together in one area while enabling an easy exit for pairs that were not a good match. For those wanting to continue, we’d extend a maximum of two quarters with the same matches.
Resources
In terms of resources, we created a central resource bank including time expectations, milestones, a goal template, conversation starters, a personal user manual, a leveling chart, and a link to a shared Slack channel. We made two versions of these guidelines, one for mentors and one for mentees.
Time expectations, as previously discussed, would be a minimum of one hour per month for a period of three months. Milestone dates would be the midpoint and endpoint of the quarter; we’d use surveys to measure success at both points and have a brief sharing session at the endpoint. The SMART goals template helped to frame each pairing’s conversations for the quarter.
We developed distinct sets of conversation starters for mentees and mentors. For mentees, the questions covered knowledge sharing, skill development, and career guidance, again providing optionality. For mentors, the questions focused on helping their mentees home in on specific work areas and identify how they could help. Examples included “What is a career decision you are currently considering, and what factors are important for you to take into account?” and “What do you feel are your greatest strengths and weaknesses, and how can we work together to leverage your strengths and overcome any challenges?” The starters were purely optional resources: a safety net for anyone blanking, not a required script.
Robert created a personal user manual that allowed individuals to instruct others on the best ways to work with them, covering preferred communication styles and motivations. We obtained an updated leveling chart from leadership, detailing both individual contributor and managing tracks. We also made a Slack channel to share mentoring news, tools, tips, techniques, and general updates.
Success
Measuring success was crucial not only for understanding the program’s effectiveness but also for ensuring continued leadership support and team participation.
What does success look like?
A flexible yet durable framework that supports diverse mentoring styles
Clear, measurable outcomes demonstrated through participation, surveys, and progress toward mentee goals
Success hinged on general program feedback, participants’ personal experiences, and feedback on individual mentors. We collected general program feedback through anonymous surveys, which included ratings and short-answer questions at both the midpoint and endpoint. We used a Likert scale to measure progress toward goals and overall satisfaction, with a score of 4 out of 5 as our target KPI for the pilot. The mentorship Slack channel also served as a venue for informal feedback. We tracked personal experiences by having participants share their progression toward the goals they set at the start; the optional end-of-program sharing session supplemented the quantitative feedback we received. Finally, we collected feedback on mentors through HR check-ins.
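The KPI check itself is simple arithmetic: average the 1–5 Likert responses and compare against the 4/5 target. The rating lists below are invented for illustration, not the actual survey data.

```python
# Hypothetical Likert responses (1-5); the real survey data is not shown here.
progress_scores = [5, 4, 4, 5, 4, 4, 4]
satisfaction_scores = [5, 5, 4, 4, 5, 4, 4]

TARGET_KPI = 4.0  # pilot target: 4 out of 5

def likert_mean(scores):
    """Average a list of 1-5 Likert ratings, rounded to two decimals."""
    return round(sum(scores) / len(scores), 2)

for name, scores in [("progress toward goals", progress_scores),
                     ("overall satisfaction", satisfaction_scores)]:
    mean = likert_mean(scores)
    status = "met" if mean >= TARGET_KPI else "missed"
    print(f"{name}: {mean}/5 ({status} target)")
```

Reporting the mean against a fixed target keeps the KPI legible across cohorts, even as the number of respondents changes.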
Concerns
During our brainstorming session, we also listed potential concerns. Some of the problems we anticipated were relationships fading over time, too much time commitment, and participants leaving CNN in the middle of the quarter (e.g., summer interns). We believed our structured, fixed-term solution could address potential issues related to retention and engagement (HBS).
Before launch, Ron and I also shared our plan with the team’s HR representative. To our surprise, in that call, we were told to implement power hours instead of one-on-one matching, and for HR, not us, to create the matches. Prepared to defend our pilot plan and seeking to understand where these solutions were coming from, we set up another meeting with HR. What was really the problem? During this meeting, we discovered that the miscommunication stemmed from the skills list we shared consisting mostly of hard skills, whereas we had actually envisioned our mentorship program to primarily help with soft skills. To address this, we added more soft skills, such as storytelling, giving and receiving critique, and career pathing, to our intake survey categories.
We found that the second point—having HR create matches for us—was also rooted in miscommunication. Our HR representative had been falsely informed that there was an automated matching program; however, we were unable to find such a program within the company and would need to create the pilot matches ourselves. HR also shared their concerns about peer-to-peer matching with us. To this point, we added an explicit opt-in to the intake form, explaining that for skill-based mentorship, mentees could be matched with mentors at lower org-chart levels. We’d still factor in org-chart level for career-based mentorship, but for skill development, we believed org-chart level was irrelevant.
Survey question
We also reiterated to HR that the matching guide detailed anonymizing responses, matching based on a set rubric, and using third-party checks (leadership or HR) to produce unbiased matches.
Step 4: Piloting one-on-one matching
We walked through the plan once more before launch, starting with the split between learning labs and one-on-one matching. We asked ourselves, why and for whom? For example, if a designer were looking for Figma training, would they turn to one-on-one mentor matching? Perhaps not, but if one were looking to navigate the promotion process, one-on-one matching would be a great place to start. With that, we set up our shared Slack channel and opened up matching, creating six matches for the pilot.
During our midpoint check-in, we found that the most visible value-add of mentorship was connecting with other designers. In addition, mentees valued meaningful connections, career advice, and encouragement. Mentors valued the opportunity to develop mentorship skills essential for advancing into management roles as well as gaining different perspectives.
Presentation of survey findings
A critical piece of feedback we received was that mentors with mentees who entered the program without clear goals struggled to structure meetings and track progress effectively. This indicated that structure and framing would enable more productive matches. We noted down some action items for the next cohort: (1) utilize the mentorship kickoff meeting to explain the resources provided; (2) create a standardized first conversation template; (3) share practical articles on effective mentoring.
What mentors found valuable
“Hearing from others that my mentee had grown and improved through the course of the mentorship. It was validating and rewarding to know the work had an important effect” — Mentor
"I’m looking for ways to build more formal mentoring experience as part of growing into design management, and this has been a great opportunity to do that” — Mentor
What mentees found valuable
“Having the chance to ask questions that I would have otherwise needed to seek answers for outside the organization” — Mentee
“Having someone to chat with about how to manage challenges in the workplace and navigate stakeholder management” — Mentee
As the cohort neared its end, we discovered that participants who came in with clear goals, even if the goals were not in specific skill areas, were satisfied with their progress toward those goals. Conversely, not having clear objectives from the beginning made it difficult to measure progress and remain focused. Additionally, balancing mentorship with high-priority projects was challenging for some. We tossed around the idea of blocking out a set time on everyone’s calendars for all mentorship matches to meet, but we ultimately decided to leave scheduling up to the matches, as we felt this would cross the line between helpful structure and too much structure. We also received feedback to give more warning about when the cohort would end and what would happen afterward. In the spirit of connection and transparency, we decided to host a sharing session for mentees to present what they’d worked on to the rest of the team.
Reflecting on our progress toward our org’s “Stronger Together” tenet, our goals as founders for the next cohort were to feel more connected to the program, run a better-structured kickoff and regular check-ins (while keeping the joy and flow of the program), and understand the impact the program had made on participants. We created six more matches for the second cohort, with one pair renewed from the pilot. As it was our second time running the program, we felt more familiar with it and focused on maintaining communication with participants. This time, we posed a blue-sky question in the survey to encourage creativity and identify opportunities, finding mentors interested in more resources on coaching and even time to connect with other mentors. Interestingly, the feedback revealed conflicting preferences between more and less structure, underscoring how important it was for the program to remain flexible in supporting diverse mentoring styles.
The success of the first two cohorts validated the program’s value. In the next cohort, we moved from validating the concept to focusing on long-term sustainability, optimizing the matching and onboarding processes to be scalable and repeatable so the program would thrive and evolve independently in the years to come. The resources we created provided the foundational infrastructure for continuity as we worked on decoupling the program’s success from Ron’s and my direct involvement.
Step 5: Hosting learning labs
Our vision for learning labs was a skill-share curriculum of tools and techniques training, facilitated by colleagues both on and outside the design team. Dedicating bi-weekly time slots to knowledge sharing would not only create a space for team learning but also build an evergreen resource bank of past recordings for easy reference.
Mission statement
The Design Mentorship Learning Lab series is a twice-monthly series of practical, inspirational presentations to advance our team's combined expertise through a shared learning experience.
While I could envision what learning labs would look like, this vision alone was too vague. One question was how we would choose topics; a quick team survey yielded an initial list of topics and volunteer speakers. We prioritized topics that were relevant, valuable, and user-focused. As with our design projects, we vowed to measure and improve.
How will we choose topics?
Relevant — What topics would be most beneficial to the team at the time of the presentation?
Valuable — Will attendees get maximum value out of a session? How does it improve our day-to-day design needs?
User-focused — Use the original survey results for guidance. Gain more input when needed.
Measure and improve
With leadership support, we secured a slot in the design monthly to advertise upcoming learning labs. In January 2023, we hosted our first learning lab: a session on creating personal user manuals. This was the invite.
Email Invite
This was followed by sessions on A/B testing, providing design feedback, accessibility, and storytelling with data, as well as hands-on workshops on micro-animation and AI art. Session formats varied; one, for example, split an hour into a 20-minute presentation, a 15-minute small-group discussion, 15 minutes of hands-on application, and a 10-minute large-group discussion.
Step 6: Refining learning labs
At this point, we were starting to have difficulty lining up speakers at a bi-weekly cadence. Contacting speakers, manual scheduling, sending invites and reminders, and facilitating sessions were also taking up a significant chunk of time for Ron and me. Participation was waning due to busy schedules and time zone differences; the team’s initial excitement had faded.
During the next learning lab, we hosted a retro discussion on what we liked, disliked, and wanted to change. We found that our peers appreciated learning general skill sets and having dedicated team time. Meeting conflicts were an issue, but recordings were appreciated. There was interest in design system training and workflow-focused learning labs: more org-specific learning rather than general skills. We shared this feedback with the design systems team to act on collaboratively.
Retro takeaways
We asked ourselves, what could someone take away from our talks and put to work the next day? It became clear that relevance and value, in the context of learning labs, were tactical skills that designers used daily and were oftentimes specific to the org.
Learning lab bank
To prioritize providing actionable takeaways, we needed to focus on quality over quantity, paring down learning labs to once a month. Around this time, design leadership had also begun implementing a recurring Peer Design Jam (PDJ) meeting for in-progress work and announcements. PDJs were slotted for one hour every Monday. This was the perfect opportunity to host learning labs, as it resolved our scheduling issue and prevented our peers from having to make an additional time commitment.
We began using 40 minutes of the first PDJ each month to host learning labs on valuable tactical skills. Scheduling and lining up speakers became much smoother once we framed learning labs as a unique space to learn immediately usable tactical skills. I am proud of Ron and myself for ultimately creating space for designers to connect, gain inspiration, and learn from one another.
Impact and what I learned
The journey of establishing the grassroots CNN Design mentorship program was a lesson in strategic execution and organizational maturity. I learned the balance of patience and experimentation, recognizing when to meticulously plan for buy-in versus when to simply hit the ground running. By focusing on clear structure, defined expectations, and measurable metrics (progress toward goals at 4.29/5 and overall satisfaction at 4.43/5), we not only validated the program’s role in fostering individual and organizational growth but also built the foundational infrastructure for its future.
Iterating upon feedback allowed us to adapt and scale the program to fit the team’s needs, ultimately connecting dozens of designers with mentors and hosting numerous learning lab topics. The program’s sustained success is now ensured through a scalable, repeatable framework that cements its place as an evergreen resource for CNN Design. Mentorship empowers designers to do their best work by cultivating a culture of connection and learning, and I am grateful to have founded something at work that continues to evolve and grow with the team.