🏴☠️ ⚡️ Issue #12: Sprint No. 2 Wrap (Day 30 to 75)
Welcome! This newsletter is dedicated to acquiring and operating Micro SaaS firms. Join us every other Saturday morning for deal analysis, operating frameworks / templates, and other musings...
Time is flying, and we’re excited to update everyone on our progress heading out of Sprint #2 with roughly 2.5mths behind the wheel. All things considered, I’d give us a B+ and we’re generally pleased with performance 🙏.
Here’s a quick sneak peek of the substance that follows:
🥅 BY THE NUMBERS — Traction and rate of improvement against our weekly scorecard
🤘 EXECUTION & HIGHLIGHTS — Through the lens of our six-week OKRs
🚧 LEARNING & CALIBRATIONS — Macro learning lessons and beyond
Let’s get to it…
🥅 BY THE NUMBERS
WEEKLY SCORECARD:
We’re excited to debut a modern scorecard with this wrap, which is built with Databox and consolidates data across our Product Admin Panel, CRM (Hubspot) and Product Event / Usage Tracking (Innertrends). The next iteration will include data points to reflect customer success (once we migrate from Zendesk…) so we have a holistic view of the biz at a glance.
This sprint was focused on rounding out the fix and improve activities (eating our vegetables 😉), with a big focus on modernizing the support center / knowledge base and optimizing the first portion of the onboarding experience to position us for intentional user growth. We maintained base rate performance for ‘Trial Sign Ups,’ ‘New Paying Users’ and ‘Churn’ — which went according to plan. There was a notable uptick in ‘New Paying Users,’ which is a lagging indicator / validation for our work on the onboarding front (aka providing a better onboarding experience today should lead to more paying users in subsequent weeks when the 14-day trial period concludes), though it’s early days and we aren’t drawing any conclusions. Let’s dive in:
New Paying Users (#): We saw an interesting 133% pop here during the sprint. I’d like to suggest this is validation for the onboarding optimization efforts we released at the beginning of the sprint, though performance here is at odds with our decline in ‘Achieved Onboarded’ — more on this below.
Churn (#): Our hope is to maintain flat, or slightly net positive, growth (i.e. new paying users at least offsetting churned users) during the fix and improve stages of transformation, as we build the foundation for scalable growth. In other words, an unexpected leak in our oxygen tank (a decline in new paying users or an uptick in churn) would introduce more pressure and reduce margin for error. No cause for alarm here.
Trial Signups (#): Again, we haven’t put our foot on the growth pedal yet. With this in mind, we are psyched about the steady trickle of ~8 trial signups a week. Expect this number to jump big time in Sprint #3 and #4 (June to August), as we allocate resources accordingly.
Achieved Onboarded (%): To frame this up, please see the screenshot below of the SaaS Funnel, now more commonly referenced as a Bow Tie.
As you start to prioritize growth, the tendency is to invest in driving traffic to the top of the funnel or left side of the bow tie (e.g. acquiring leads or trial users). This is often premature if there are opportunities to improve conversion rates down funnel, because your ultimate goal is paying users vs creating a ton of trial users that never convert. Said another way, expand the diameter of the pipe, then pump more water through. ‘Achieved Onboarded’ (aka the ‘aha moment’) is the first action to optimize for within the broader context of the onboarding motion. We have established a thesis for the definition of ‘Achieved Onboarded’ and implemented the tooling to guide users and observe behavior within the product such that we can prove or disprove the thesis.
Our target is 15% and we averaged 8.82% (down from 12.8%), with a lot of variance week to week. All in, it’s too early to determine if our thesis is on the right track. Once dialed in, we’ll move on to optimizing the user guiding / prompts to achieve the remaining onboarding goals. Continue to look for improvement in ‘Trial to Paid Conversion,’ as we need to feel very confident here before going hard at growth. This is our #1 priority. (For more on ‘Smart Trials,’ this is an excellent read.)
Trial to Paid Conversion: Logically, we should see an improvement in ‘Achieved Onboarded’ before seeing improvement in the downstream ‘Trial to Paid Conversion’ metric. It follows, then, that we are a bit surprised to see the 168% improvement here, when ‘Achieved Onboarded’ was pretty volatile week to week.
Next sprint, we need to see an improvement in ‘Achieved Onboarded’ to substantiate a (hopefully) consistent or improving ‘Trial to Paid Conversion’ rate, and to feel overall confidence about our strategy and execution in optimizing the earliest portions of user onboarding. (A toy sketch of how these two rates can be computed from trial-event data follows below.)
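For the data-curious, here’s a toy sketch of how these two rates can be computed from a per-trial event log. The field names (signed_up_at, achieved_onboarded_at, converted_at) are illustrative placeholders, not our actual Innertrends / Hubspot schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Trial:
    signed_up_at: date
    achieved_onboarded_at: Optional[date] = None  # first 'aha moment' event, if any
    converted_at: Optional[date] = None           # became a paying user, if any

def achieved_onboarded_rate(trials: list[Trial]) -> float:
    """% of trial signups that reached the 'Achieved Onboarded' milestone."""
    if not trials:
        return 0.0
    onboarded = sum(1 for t in trials if t.achieved_onboarded_at is not None)
    return 100 * onboarded / len(trials)

def trial_to_paid_rate(trials: list[Trial]) -> float:
    """% of trial signups that converted to paid after the 14-day trial window."""
    if not trials:
        return 0.0
    paid = sum(1 for t in trials if t.converted_at is not None)
    return 100 * paid / len(trials)

# Example week: 8 signups, 1 of whom onboarded and later converted
week = [Trial(signed_up_at=date(2023, 5, 1)) for _ in range(8)]
week[0].achieved_onboarded_at = date(2023, 5, 3)
week[0].converted_at = date(2023, 5, 16)
print(f"Achieved Onboarded: {achieved_onboarded_rate(week):.1f}%")  # 12.5%
print(f"Trial to Paid:      {trial_to_paid_rate(week):.1f}%")       # 12.5%
```

The point of watching both rates side by side is exactly the tension described above: if ‘Trial to Paid’ climbs while ‘Achieved Onboarded’ stays flat or volatile, the onboarding thesis isn’t yet doing the work we think it is.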
YEAR-OVER-YEAR (YoY):
A YoY view helps to isolate / adjust for seasonality and interpret performance against year-to-date (YTD) trend lines. As mentioned, our goal re ‘New Paying Users,’ ‘Churn’ and ‘Trial Signups’ is steady state. YTD and YoY look good from this perspective.
🤘 EXECUTION & HIGHLIGHTS
SPRINT #2 OKRs (4/1 to 5/12)
OBJECTIVE #1: Improve % of users who achieve onboarded
KEY RESULTS & HIGHLIGHTS:
✅ Define 5x onboarding goals and detail associated technical steps — This effort started with a thesis re defining ‘onboarding’ and implementing trial user prompts in the form of email drips (via Hubspot) and in-app cues (via Userguiding). In the broader context of a fine-tuned, comprehensive onboarding process, you must have a bigger-picture view of the next 3-5x critical outcomes (or value propositions) your SaaS delivers to users. You then need to map every single step / click required to achieve those outcomes so you can guide the user accordingly (a toy illustration follows after these key results). We have an informed view here. Excited to nail ‘Achieved Onboarded’ so we can optimize for the next onboarding goal and giddy on up to growth.
✅ Build and release trial user nurture email sequence and in-app cues — Following on the above, we have a 4x email drip and a very thoughtful in-app user guide live in the wild. ‘Trial to Paid’ conversions are stellar, but the number of trial users who ‘Achieve Onboarded’ needs to normalize from week to week, lifting the average over time, before we can move on.
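To make the ‘map every step’ idea concrete, here’s a toy illustration. The goal names and product events below are made up for the example and aren’t our actual product’s; the real value is in doing this mapping exhaustively for your own critical outcomes.

```python
# Hypothetical onboarding map: each goal lists the in-product events (steps/clicks)
# a trial user must complete to reach it. Names are illustrative only.
ONBOARDING_GOALS = {
    "achieved_onboarded": ["created_account", "connected_data_source", "created_first_project"],
    "invited_team": ["opened_team_settings", "sent_invite"],
    "activated_core_feature": ["opened_editor", "published_first_item"],
}

def goals_reached(user_events: set[str]) -> list[str]:
    """Return the onboarding goals whose required steps this user has completed."""
    return [
        goal for goal, steps in ONBOARDING_GOALS.items()
        if all(step in user_events for step in steps)
    ]

# Example: a user who has created an account and connected data, but no project yet
print(goals_reached({"created_account", "connected_data_source"}))  # []
```

Once the map exists, every email drip and in-app cue can point the user at the next incomplete step for the goal they’re closest to.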
OBJECTIVE #2: Refresh online presence (website, blog, support center)
KEY RESULTS & HIGHLIGHTS:
✅ Release new support center / knowledge base — The legacy support content was spread across an Academy, a traditional support center and a Facebook group. This made it very difficult for users to quickly find the help they need in real time, which leads to support tickets and increased cost to serve. Not great. We were able to consolidate content from all three sources by migrating everything to a Hubspot-hosted knowledge base, which also provides search functionality to accelerate ‘discoverability’ of relevant content. This is only v1 of the planned support center improvements, though a HUGE win.
✅ Release new blog — This is set to go live next week and we’re thrilled to get the content engine going. It will serve as a foundation for improving organic search results and inbound website visitor traffic. Content is a slow burn that compounds over time; we’re grateful to be planting seeds here.
😐 Announce brand refresh and release new website — This will very likely push into the next sprint, though a 6wk timeline for an entirely new website was certainly optimistic. Thrilled with the partner we’re working with and should have this box fully checked in the coming weeks.
OBJECTIVE #3: Build momentum for inbound (content velocity, back-linking)
KEY RESULTS & HIGHLIGHTS:
✅ Build and launch 6wk content calendar — There is a TON of foundational work here related to SEO research and competitor audits, which culminated in a keyword strategy and a content calendar that maps accordingly. We also built an iterative playbook for developing each 6wk content calendar, which will reduce the time intensity moving forward. It goes without saying that LLMs have totally changed the resource requirements for these work streams, and it is wild to apply this tooling to our operating models.
OBJECTIVE #4: Customer Support Playbook & Resource Plan
KEY RESULTS & HIGHLIGHTS:
✅ 0 unsolved historical support tickets — We introduced a lot of discipline re classification and record keeping for support tickets and were able to clear the deck. We’ll now have cleaner data and better visibility into support ticket patterns, as well as into volume and response time improvement rates. This is a foundation we can build on.
✅ Train LLM model on historical support tickets and best practice responses — It is a strange thing to exist in the midst of a paradigm-shifting new technology such as LLMs and ChatGPT. We are getting a ton of leverage by training models on support ticket resolution, then using the tool internally to generate a first cut of every support ticket response moving forward. This has reduced the time intensity of onboarding ourselves to the business and streamlined support response time / accuracy.
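For a flavor of the approach (and without claiming this is our exact pipeline, since ‘training’ here spans prompting, retrieval and fine-tuning), here’s a minimal sketch of one common pattern: pull a few similar resolved tickets and let an LLM draft a first-cut reply from them. The ticket fields, the retrieval step and the model choice are hypothetical placeholders, and a human reviews every draft before it goes out.

```python
from openai import OpenAI  # requires the openai package and an OPENAI_API_KEY

client = OpenAI()

def draft_reply(new_ticket: str, similar_tickets: list[dict]) -> str:
    """Draft a first-cut support response grounded in similar resolved tickets.

    `similar_tickets` is assumed to come from your own retrieval step (keyword or
    embedding search over historical tickets), with hypothetical
    'question' / 'best_response' fields.
    """
    examples = "\n\n".join(
        f"Ticket: {t['question']}\nBest-practice response: {t['best_response']}"
        for t in similar_tickets
    )
    messages = [
        {
            "role": "system",
            "content": (
                "You are a support agent. Draft a reply in our tone, reusing the "
                "best-practice responses below where relevant. A human will review "
                "and edit before sending.\n\n" + examples
            ),
        },
        {"role": "user", "content": new_ticket},
    ]
    # Model choice is illustrative; swap in whatever model you're standardized on.
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content
```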
PRELIM SPRINT #3 OKRs (5/15 to 6/23)
These are half-baked and will be fully fleshed out heading out of next week, though in the interest of providing a preliminary view on the path forward, here’s an early take on next-up priorities:
OBJECTIVE #1: Improve % of users who achieve next activation goal
OBJECTIVE #2: Fine-tune internal LLM tool on best practice support ticket responses
OBJECTIVE #3: Refresh and reorganize support center content
OBJECTIVE #4: Define our ICP and core customer segments
🚧 LEARNING & CALIBRATIONS
Operating Cadence & The transition from fix / improve to OKRs
Out of the gate, we grouped activities by domain (e.g. customer success, product) and transformation stage (e.g. Fix, Improve, Grow). This was useful as a hard-and-fast way to get our thinking on the table and reach agreement on general priorities. In hindsight, before moving past planning, we should have then grouped fix, improve, and grow activities into crystal clear objectives. You just can’t beat the simplicity and purity of setting 3 to 5 objectives for the business in a defined period of time and beating the drum accordingly. Alignment / clarity = execution / performance.
Further, as we say over and over again, the pace you can move with Micro SaaS is the juice and appeal that gets most of us very excited. On the topic of operating cadence (aka the rhythm and pace applied to how work is done and organized), we knew the traditional 3mth quarter was way too long, so we started with a month. This felt rushed to everyone: too tight to deliver on the more substantive work streams (aka objectives), and it didn’t provide the space for reflection / learning / planning in between sprints, which is sacred time despite the perceived lack of output. Hence the six-week sprints we’ve landed on.
‘Focus creates the space for performance’
You can move so fast in Micro SaaS with lean teams made up of individuals that punch way above their weight class. This lends well toward ‘biting off more than you can chew.’ Tackling too much increases the cognitive load and context switching across the team. You end up with a bunch of stuff kinda well done vs a few things done at an elite level. I fear I fell victim to this setup and introduced too many priorities, to the detriment of execution across the board. Moving forward, expect much more tightly defined objectives, chunked according to level of effort and what can be executed on a six-week timeline. By way of metaphor, we need to build muscle over time, as opposed to attempting to squat the rack out of the gate. Feeling out the right pace of execution is of course a learning process and I feel we’ve learned a lot. The base layer of muscle has us very excited 💪.
In PLG, onboarding = your sales process
I’ve spent my career generating growth as a function of human-based sales teams and consultative sales process design / adherence. Operating leverage is hard to find when the components are mostly human. In contrast, PLG is composed of data and automation: thoughtfully curating and coordinating every user prompt to demonstrate the product’s utility, build habit loops and secure loyal paying customers. The awesome side of the coin is non-linear growth; the tough side of the coin lies in the volume of variables and the time it takes to make legitimate observations. Further, the onboarding experience must constantly be measured, revisited and iterated on, in the same way you iterate on a product or a traditional sales process. I’m excited to continue studying this dimension of Micro SaaS and have raw respect for the masters of the realm.