How do we evaluate applications?

This explains how the membership team evaluates applications for GrowthX membership. We also cover every parameter we consider during our curation process.

I am a law only for my kind, I am no law for all.
Friedrich Nietzsche, Thus Spoke Zarathustra
There are various definitions of a holistic assessment. Still, almost all of them agree that, at its core, it’s about giving the reviewer/curator as much information as possible about what an applicant brings to the community. Implemented appropriately, it supports a fair, inclusive, and incremental process that helps identify applicants who effectively meet mutual outcomes, needs, and goals. To understand how we look at assessments, we must first understand GrowthX, the community, and all the learning initiatives/experiences.

What is GrowthX?

The one-liner → A social learning platform for the top 1% of growth leaders.
The elevator pitch → “GrowthX is a community of top business professionals accelerating their careers and companies, right from early → mid → leadership roles at top product organizations.”
GrowthX gives folks the ability + motivation to drive accelerated outcomes. Here's how —
  • Enabling ability — By learning from top practitioners, not professors
  • Accelerating motivation — Peer-led learning and an enriching community
  • Redefining outcomes — Coaching, personal branding and access to top companies

Who are the GrowthX members?

As of writing, the community consists of 2,629 members: 33% are founders & leaders, 35% are product leaders & operators, 25% are marketing leaders & operators, and the remaining 7-8% are folks with varied backgrounds — doctors, lawyers, software devs, consultants, VCs, and even an ISRO scientist who worked on the PSLV. Together, we strive to create an environment that values practical outcomes and innate curiosity, and establishes a profound sense of belonging for each community member. Our shared drive to understand and solve growth problems brings us together.

What does learning look like in GrowthX?

We use multiple learning formats with a variety of objectives, but the umbrella thought around learning is simple.
  • Learn a specific skill from the best operators in the world.
  • Learn from case studies & apply mental models to real products.
  • Learn from curated peers with sometimes contrary perspectives.
  • Learn by applying everything at your workplace & learn from experiments.
  • Find joy in building a new social circle with extremely empathetic members.
On the global stage, GrowthX's 8-week immersion is unparalleled in its impact on the companies these professionals work for. This year alone, over 300 founders in GrowthX drove cumulative revenue of $150 million (₹1,200 Cr) — and that's before counting the revenue impact made by every other leader and operator in the community.

In 2023, we launched a new hybrid learning format, “CRAFTs”. These 2-3-week-long programs are designed to help founders, product managers & marketers solve specific problems at work and build muscle on different aspects of a growth role. They are built with just one thing in mind: practical outcomes at work. The community decides the problem statement, we identify top practitioners, and we work closely with them to develop frameworks that can be implemented at work on a Monday morning to drive impact for our members.

Now that we’ve understood the core membership, let’s look at how we use rubrics to curate new members.


What are rubrics?

A rubric is commonly defined as a tool that articulates the expectations for an assignment or assessment by listing the criteria and, for each criterion, describing levels of quality. Too hard to understand, we know — let’s simplify this.
Imagine a business school trying to admit top students into their programs. They will look at a few things.
  1. Gather evidence & information from multiple sources to gauge applicants’ knowledge, skills, experiences, and personal attributes.
  2. Use threshold scores determined using only measures of academic experience and cognitive skills, such as GPA, board exam marks, and GRE/GMAT test scores.
  3. Give thought to the weighting of various components of the application, the order in which those components are reviewed, and who reviews them, to make the process as equitable and fair as possible.
  4. Some even use rubrics to help ensure that curators/reviewers evaluate applicants consistently and in alignment with program goals. Here’s an example of a university looking to prioritize applicants who have research experience:
| Component | Point values |
| --- | --- |
| Research experience | 3 - 1 year of UG + work/internship research; 2 - 1 year of UG research; 0-1 - less than a year; 1-2 extra points for publications, posters, awards, etc. |
| Letters of Recommendation | 3 - very strong letters; 2 - moderately strong letters; 1 - below-average letters; -1 - red flag in letters |
| Work experience / CV | 2 - 2+ relevant years of experience; 1 - 1-2 relevant years of experience; 1 extra for volunteer work |
| Undergrad curriculum | 1 - extensive science coursework; 1 extra credit for a curriculum that is among the top globally |
| GPA | 4 - 3.7 to 4.0; 3 - 3.4 to 3.69; 2 - 3.2 to 3.39; 1 - 3.0 to 3.19 |
| Personal statement of purpose | 2 - suggests strong fit; 1 - suggests good fit; 0 - unclear fit; -1 - poor fit; 1 extra point for hardships, disadvantages |
| GRE scores | Leaving this out |
| Total | 20-25 - Strong admit; 17-19 - Admit; 14-16 - Probable admit; 10-13 - Probable deny; 0-9 - Deny |
It’s wise to note that a rubric is intended as the beginning of a discussion, not the source of a firm admit/deny decision. Admission committees then reach final decisions through discussion and consensus involving faculty, program directors, coordinators, and administrators.
Remember that this process spans a long period, often with minimal interaction with the admissions team of the university you are applying to. You are given vague answers at every step, and even vaguer answers about why you weren't a fit or were flagged.

That's a lot to process. So far, we've reviewed what GrowthX is and how applications are generally processed at business schools. Now, let's deep-dive into how we evaluate applicants at GrowthX.

We call it the SER rubric.

S - Significant Contribution
E - Empathy
R - Resourcefulness

Why are we checking for these broadly?

We’ve identified this pattern: positive communities result from extremely empathetic people trying to help each other. Plus, resourcefulness is a core trait for anyone building a compounding career — ask any leader you've ever worked with. And significant contribution is a direct signal of whether you would act on the knowledge and connections you will earn once you are inside the community.

How are we checking for these?

We have a simple application form that lets us measure how relevant the outcomes our applicants want to achieve are, and the impact GrowthX can have on them. Simply put, we do not accept applicants we can't give an extremely high ROI, period.

Next, we invite applicants to a jamming session. This slightly higher-commitment step helps us ensure we attract only the highest-intent applicants. The jamming session is designed to emulate how a boardroom at a product organization would solve a growth problem. Applicants are given a problem statement and some relevant context, and are encouraged to use the internet to build their own diagnosis of the problem. Along with the membership team, the applicants then jam on the problem, working toward an experiment or solution. We finally cool down with a short round of getting to know what they do in their personal lives, anything they feel comfortable sharing.

The format allows us to measure —

  1. Behavior in a peer-led situation (core to the community)
  2. Approach to problem-solving in a professional setup (analytical ability)
  3. Interest in solving ANY growth problem for ANY product (curiosity)
  4. Resourcefulness when given limited time to research, diagnose, and build an experiment — which is generally the case for any business decision
  5. Willingness to share/collaborate with peers
  6. Choice of words, candor, and presence on a professional call

Here’s what the GrowthX rubric table looks like —

The table groups the signals we look for across the three SER dimensions by degree of importance.

First degree
  • Provides a statistically significant perspective to the conversation
  • Non-defensive attitude
  • Adds value by listening carefully to others’ points
  • Backs up someone else with relevant data
  • Cognisant of others’ time
  • Effective time management
  • Influences the conversation
  • Zoom call etiquette
  • Asks questions to clarify
  • Influences the conversation multiple times
  • Mindful of other peers’ contributions
  • Asks the right questions about the problem
  • Provides an incremental point to the conversation
  • Beginner’s mindset
  • Structured thought process with clear problem identification
  • Provides a statistically significant contrary perspective to the conversation
  • Willingness to collaborate

Second degree
  • Quality of point mentioned vs time used
  • Being open to disagreements
  • Being able to give pros and cons to experiments
  • Sharing their knowledge with other participants
  • Finding data quickly
  • Structured thought process with clear problem identification
  • Clarity in assumptions vs reality

Third degree
  • Cons & blockers for experiments/solutions
  • Listening intently and providing value to their peers
  • Relevant answer to the section
  • Finding relevant data
  • Jargon-free conversation
  • Tone of voice during arguments or rebuttals

Points table that we use for the rubric

| Component | Point values |
| --- | --- |
| Significant Contribution | 5 - primary root cause of the problem statement identified, backed with data; 4 - secondary root cause identified, backed with data; 2-3 - a primary/secondary/tertiary root cause, without data; 1-2 - not a root cause, but an industry problem; 0 - not a relevant answer to the discussion; 1-2 extra points for examples, delivered within the time given and without jargon; -1 to -2 points for making no relevant points even after taking time, or merely rephrasing the problem statement |
| Empathy | 4-5 - willingness to collaborate, user understanding, peer understanding, perfect choice of words and tone, humble, not repeating others’ points; 2-3 - user understanding, peer understanding, cognizant of others’ time; 0-1 - willingness to collaborate; 1-2 extra points for Zoom etiquette (video on, mic on only while speaking), patience, self-awareness; -1 to -2 points for being aggressive or argumentative, not giving others time, not appreciating others’ points, or not following the ground rules |
| Resourcefulness | 4-5 - relevant data found in the time provided, asking questions relevant to the problem; 2-3 - clarity in assumptions vs reality; 0-1 - listening actively; 1-2 extra points for spotting an unexplored problem by listening to peers, finding ad-hoc data, or explaining a problem-solving structure that has worked for them; -1 to -2 points for citing wrong data, making factually wrong inferences, or quoting an article without truth-seeking |
| Total | 16-21 - Strong fit; 11-15 - Good fit; 8-10 - Probable fit; 5-8 - Probable deny; 0-4 - Waitlist |
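To make the bands concrete, here is a minimal sketch of how the points table could be applied in code. The band boundaries come from the rubric above; the function names are ours, and since the published "Probable fit" and "Probable deny" bands overlap at a total of 8, this sketch resolves that boundary upward as an illustrative assumption. It is not GrowthX's actual tooling.

```python
def ser_total(contribution: int, empathy: int, resourcefulness: int) -> int:
    """Sum the three SER component scores from the points table."""
    return contribution + empathy + resourcefulness


def verdict(total: int) -> str:
    """Map a total SER score to the rubric's fit bands."""
    if total >= 16:
        return "Strong fit"     # 16-21
    if total >= 11:
        return "Good fit"       # 11-15
    if total >= 8:
        return "Probable fit"   # 8-10 (the bands overlap at 8; resolved upward here)
    if total >= 5:
        return "Probable deny"  # 5-7
    return "Waitlist"           # 0-4


print(verdict(ser_total(5, 4, 3)))  # total of 12 → "Good fit"
```

The mapping is intentionally a starting point for discussion, mirroring the business-school example earlier: a score places an applicant in a band, and the membership team's conversation does the rest.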
Once the jamming session is done, the membership team spends 1-2 hours discussing each applicant, their desired outcomes, and their jamming-session performance before making a final decision. This process lets us better understand our applicants and decide who would be the right fit for the community.

Written by

Karan Nagarajan

Karan is the membership team lead at GrowthX. He has scaled digital distribution for various enterprise & B2B organizations like ANSYS, NASSCOM, 10000 Startups, Capgemini, and EcoEnergy Insights.