As engineers, we often focus on building and optimizing features, but what about understanding how users interact with our work after launch? Analytics provides us with crucial insights that can transform our approach. By leveraging data, we can answer key questions like:
Are users engaging with our features? 🤔
Where are they encountering difficulties? ⚠️
How can we enhance their experience? ✨
Here’s a guide on making analytics work for you as an engineer.
Skipping Analytics ❌
One of the biggest missteps is deciding not to add analytics to certain features. This can happen when we feel the data is unnecessary, or when we're just eager to ship. But skipping analytics is like launching a rocket without telemetry: it leaves us in the dark about user behaviour.
Adding analytics, even if it seems small, is like putting markers along a path. It gives you data to look back on, helping you see what's working and what isn't.
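Even a tiny, fire-and-forget event is enough to start leaving those markers. Here's a minimal sketch of what that might look like; the endpoint and event names are placeholders, not a specific provider's API:

```ts
// Minimal fire-and-forget tracking helper. The endpoint and event
// names below are placeholders for whatever provider you use.
const ANALYTICS_ENDPOINT = "https://analytics.example.com/events";

export function track(
  name: string,
  properties: Record<string, string | number | boolean> = {}
): void {
  void fetch(ANALYTICS_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, properties, timestamp: new Date().toISOString() }),
  }).catch(() => {
    // Swallow errors: analytics should never break the feature itself.
  });
}

// One line at launch is enough to look back on later:
track("openExportDialog", { source: "toolbar" });
```

Even if the event never makes it onto a dashboard right away, the data is there when you need to look back.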
Using Multiple Analytics Tools 🛠️
With growth, it’s tempting to introduce new tools for each product segment. In my experience, however, this can create more problems than it solves. Initially, we used Plausible and Umami for different small products and Redash for community insights. Each tool stored data in its own way, making it challenging to build a unified view.
Challenges with Multiple Tools:
Inconsistent Data 📊: Each analytics tool tends to track and store data in its own way, which makes it hard to form a cohesive, aligned picture from the insights.
For example, one tool might track user engagement one way while another does it differently, leading to reports that don't match up and are tough to piece together.
Higher Maintenance ⚙️: Managing multiple tools takes more effort and resources. Each tool might need regular updates, configuration changes, and troubleshooting, which can make your workflow feel scattered.
This approach can slow down how quickly you analyze data and apply insights because you spend time jumping between different platforms and making sure each one is working properly.
Confusion 🤷‍♂️: When data is spread across different tools, team members can lose trust in it, because discrepancies between tools raise doubts about accuracy.
This makes decision-making harder, as teams might be unsure about using insights that don't come from a single, reliable data source.
Tip: Stick to just one or two trusty tools for collecting data across your products. It'll save you loads of time and make it way easier to use your insights.
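One way to make that stick is a thin facade that every product calls, so the tool behind it can be swapped without touching feature code. A sketch, assuming a provider client with a generic capture method (the `ProviderClient` shape here is hypothetical, not a real SDK):

```ts
// One facade for all products: swapping analytics providers later
// means changing this file, not every feature.
interface ProviderClient {
  capture(eventName: string, properties: Record<string, unknown>): void;
}

class Analytics {
  constructor(private readonly provider: ProviderClient) {}

  track(eventName: string, properties: Record<string, unknown> = {}): void {
    // Attach shared context once, so every event has the same shape.
    this.provider.capture(eventName, {
      ...properties,
      app: "my-product",
      sentAt: new Date().toISOString(),
    });
  }
}

// Console-backed client for local development; swap in the real one.
const analytics = new Analytics({
  capture: (name, props) => console.log("[analytics]", name, props),
});

analytics.track("signupComplete", { plan: "free" });
```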
The Learning Curve 📈
The first time I wrote a query, it felt overwhelming, especially since I am primarily a front-end developer. I remember starting with something simple like:
```sql
SELECT * FROM h_hackers h WHERE h.email = 'notgoingtosharepublicly@gmail.com';
```
That query took forever to run because I had no clue which fields mattered. At first, every query felt like a struggle — lots of trial and error with table names, fields and joins. But once I got the hang of SQL, I could explore data on my own, spot patterns, and get answers way quicker.
Why Learning Queries Matters:
Saves time waiting on the analytics team (if present).
Gives direct access to insights that may be buried in dashboards.
Lets you troubleshoot product issues independently.
Understanding User Perspectives 👥
Engineers are often evaluated on what we ship, so focusing on analytics doesn’t always feel like a priority.
However, analytics gives us valuable insights into how users navigate our product, where they get stuck and what features they use most. Reviewing session data, user paths, and patterns in real-time allows us to make data-driven adjustments.
Ways to Use Analytics for User Insights:
Track Common User Paths: Understand typical workflows and catch any issues users might face along the way.
Identify Drop-off Points: See where users abandon tasks, which can signal usability issues or confusing flows.
Spot Usage Patterns: Notice trends around feature usage, which helps in prioritizing improvements.
Sometimes a small UI tweak guided by these insights can improve your completion rate by X% without a full redesign.
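One way to surface those drop-off points is to emit one event per step of a flow under a shared funnel name, so step counts can be compared side by side. A minimal sketch (the funnel and step names here are made up for illustration):

```ts
// One event per funnel step: drop-off shows up as the gap in counts
// between consecutive steps on a dashboard or in a query.
type CheckoutStep = "viewCart" | "enterShipping" | "enterPayment" | "confirmOrder";

// Stand-in for your real tracking client.
const track = (name: string, props: Record<string, unknown>) =>
  console.log("[analytics]", name, props);

function trackFunnelStep(step: CheckoutStep, userId: string): void {
  track("checkoutStep", { funnel: "checkout", step, userId });
}

trackFunnelStep("viewCart", "user-123");
trackFunnelStep("enterShipping", "user-123");
// If "enterPayment" events are much rarer than "enterShipping" ones,
// that's the drop-off point worth a closer look.
```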
Session Replays 🔄
Session replays are powerful for spotting problems hidden behind the numbers. They aren't just for diagnosing bugs; they're also essential for finding UX issues that users might not report.
When data looks "off," replays let you watch real user interactions and identify potential pain points. Use them to:
Spot UX Issues: Understand how users navigate and see where they get stuck.
Identify Product Flow Problems: Find parts of the feature that are confusing or unnecessary.
Validate Hypotheses: Test if your assumptions about user behaviour match reality.
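Most replay tools also let you attach properties to the current session so the interesting replays are easy to find later. The API below is a stand-in for whatever your tool actually exposes, not a real SDK call:

```ts
// Hypothetical replay SDK surface; real tools offer something similar
// (attaching session/user properties) under their own method names.
interface ReplaySdk {
  setSessionProperties(props: Record<string, string>): void;
}

declare const replay: ReplaySdk; // injected by your replay tool's snippet

// Tag sessions where the data looks "off" so you can filter for them.
function flagSessionForReview(reason: string): void {
  replay.setSessionProperties({ needsReview: "true", reason });
}

// Example: the user hammered the same button several times in a row.
flagSessionForReview("repeatedSubmitClicks");
```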
DLI (Data that Lasts Insights) 📅
Engineers typically focus on quantitative metrics like load times and query speeds, but these alone don’t give a complete picture. DLI is about long-term impact: observing how feature changes affect user engagement over time, rather than just in the first days post-launch.
Tips for Tracking DLI:
Avoid overreacting to early data fluctuations.
Look for sustained changes in user behaviour over weeks or months.
Evaluate if new changes increase engagement or retention in the long term (very important).
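A concrete way to check for sustained change is to bucket event timestamps by week and compare the whole window, rather than judging from launch week alone. A rough sketch, assuming you can pull raw timestamps for a feature's events:

```ts
// Label each event with the start date of its week (UTC, weeks
// starting Sunday), then count events per week.
function weekStart(date: Date): string {
  const d = new Date(date);
  d.setUTCHours(0, 0, 0, 0);
  d.setUTCDate(d.getUTCDate() - d.getUTCDay());
  return d.toISOString().slice(0, 10); // e.g. "2024-03-10"
}

function weeklyEventCounts(timestamps: Date[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const ts of timestamps) {
    const week = weekStart(ts);
    counts.set(week, (counts.get(week) ?? 0) + 1);
  }
  return counts;
}

// A launch-week spike that fades is noise; counts that hold or grow
// across weeks are the sustained change you're looking for.
```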
Sending Clean Data
As companies grow, data can get pretty messy if it's not organized well, making it tough to get reliable insights. Setting up standards early on for consistent, usable data is super important. Here are some tips from a developer's point of view to keep your data clean:
Consistent Naming Conventions: Use lowercase, present-tense, camelCase names (e.g., `clickSignupButton`) for event names and property keys. This consistency makes it easier to read, maintain, and query your analytics data.
Versioning Events: As your application evolves, implement versioning for events (e.g., `signupV2`). This lets you compare old and new event data without losing historical context, aiding in debugging and data analysis.
Type Safety: Use TypeScript or similar tools to define data types for your analytics events (see the sketch after this list). This ensures the properties sent to your analytics platform match expected formats, reducing errors during data collection.
Utilize Feature Flags: When launching new features with associated analytics, use feature flags to control the rollout. This lets you test the impact on user behaviour without affecting all users, and you can adjust your analytics tracking as needed.
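Here's a small sketch that ties these tips together: camelCase event names, a versioned event, compile-time type safety, and a feature flag gating the new event. The event names, flag name, and `isFlagEnabled` helper are illustrative assumptions, not a specific SDK:

```ts
// Typed event map: each event name is bound to the exact properties
// it must carry, so a typo or missing field fails at compile time.
interface EventMap {
  clickSignupButton: { page: string };
  signupComplete: { plan: string };
  signupCompleteV2: { plan: "free" | "pro"; referrer?: string };
}

function trackTyped<K extends keyof EventMap>(name: K, properties: EventMap[K]): void {
  console.log("[analytics]", name, properties); // stand-in for a real client
}

// Hypothetical flag check; swap in your feature-flag SDK of choice.
function isFlagEnabled(flag: string): boolean {
  return flag === "newSignupFlow"; // stubbed for the example
}

// Only users in the rollout emit the new, versioned event; everyone
// else keeps producing the old event stream for comparison.
if (isFlagEnabled("newSignupFlow")) {
  trackTyped("signupCompleteV2", { plan: "free" });
} else {
  trackTyped("signupComplete", { plan: "free" });
}
```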
Adding analytics to your engineering workflow might feel like it's not as important as building and shipping features, but the perks are totally worth it. Getting comfy with data means you're not just making features — you're crafting products that really connect with users and offer lasting value. 🌟
Digging into data, whether it's writing queries or checking out session replays, gives you a wider view that helps you become a better, more user-focused engineer.