What are Privacy-Friendly Analytics and Why Should Agencies Take Them Seriously?

June 30, 2023

This blog post was written by Chris Pattison, a former agency owner and the co-founder of Squeaky Analytics. Chris has spent 14 years working in agencies and SaaS companies, leading design and product teams to deliver high-growth, award-winning user experiences.

If you’re using analytics tools to help your clients improve their websites, you’ll be familiar with the ongoing concerns around privacy and data protection, but perhaps not with where those concerns originate. Why do agencies need to pay attention to privacy and compliance regulation, and how can they continue to support their clients in this new privacy-centric world?

Well, as the co-founder of an analytics company, and a former agency owner, I can hopefully offer some helpful thoughts and advice. In this article, I’ll briefly outline where the recent concerns around privacy originated, how changes in data privacy have impacted traditional analytics tools, and the many upsides of switching to a privacy-friendly solution.

A digital wild west

AI Generated Science-Fiction Sheriff

With the emergence of Web 2.0 came a modern-day gold rush, but instead of picks, shovels, and gold, technology companies began capturing, storing, and analysing a new asset class: user data. This new wild west led to incredible profits and growth for many digital businesses and supported the rise of household names like Meta and Google, but it came at the expense of user privacy.

It wasn’t just big businesses, either: a huge share of companies began installing analytics software on their websites and apps, hoping to leverage data to increase the quality of their goods and services, as well as their profits.

And, honestly, it worked! Data-driven businesses have been the proven winners in our digital world, using analytics data to power their decision-making company-wide.

So, what’s the big deal? Well, to put it simply, it has come at a cost. Knowing which products on your site sell best is vital, but companies rarely stopped there: their analytics tools scooped up everything - names, email addresses, dates of birth, blood types, and anything else they could get their hands on.

The thing is, not only did most users not know this was happening, but most companies didn’t need so much data, and they didn’t have the data protection practices and controls in place to deal with it. This led to huge leaks, and to immoral or illegal uses of personal data for company profit that fuelled identity theft, political manipulation, and just about every other kind of digital crime you can imagine. Something had to give, and where companies wouldn’t act in the face of consumer backlash and scandal, regulators had to.

Taming the beast

There has been a host of legislative responses to the rampant capture of personal data and the failures of data protection - perhaps most notable is the European Union’s General Data Protection Regulation (GDPR), which came into effect in May 2018.

Flag of Europe

The GDPR introduced stringent rules around data protection, including requirements for businesses to obtain clear consent from users before collecting personal data, and it gave users the right to know what data is being collected and the ability to opt out. Non-compliance with GDPR can result in significant fines: up to €20 million or 4% of a business's annual global turnover, whichever is higher.

However, the EU is not alone in this mission:

  • 🇺🇸 The California Consumer Privacy Act (CCPA) came into effect in January 2020, offering similar protections for California residents.
  • 🇧🇷 In Brazil, the General Data Protection Law (LGPD) was enacted in August 2020, reflecting similar concerns around user data protection.
  • 🇮🇳 Similarly, India's Personal Data Protection Bill is expected to be passed in the summer of 2023, likely by July or August.

These bills are all great examples of the growing global trend towards heightened user data protection. So, what does this mean for agencies and their clients?

Implications for agencies

Just as changing consumer preferences and emerging legislation have had an impact across digital industries, so too have they impacted agencies. Indeed, web agencies that continue to use legacy analytics tools are opening themselves up to several risks, including, but not limited to:

  1. Legal and financial risks of non-compliance: As we’ve already covered, data protection laws like GDPR and CCPA impose strict rules on the collection and processing of personal data. Conventional analytics tools tend to collect data indiscriminately and may not comply with these laws. Non-compliance can result in hefty fines, as well as damage to your or your client’s reputation.
  2. Loss of customer trust: With growing awareness of data privacy issues, customers are becoming more wary of companies that collect their data unnecessarily, and they expect reasonable, transparent data capture. If they find out a company has been capturing personal data without their consent, it can lead to serious backlash and commercial consequences.
  3. Huge gaps in your data: Legacy analytics tools rely on cookie-based tracking, which has become synonymous with privacy-violating practices. Because of this, 42% of users now choose to reject the use of cookies, which means your cookie-based analytics product could have a 42% gap in the data it’s capturing. You and your clients will be making decisions without the full picture of your user experience.
  4. Data breaches: Legacy analytics tools often collect more data than necessary, including a lot of personal data on users. This makes them a prime target for cybercriminals, with the resulting data breaches leaving you and your clients exposed.
  5. Failure to future-proof: With data privacy legislation only set to increase, sticking with older analytics tools means continually playing catch-up. This is expensive, and it will erode your and your clients’ trust in the tools you’re using, the data being captured, and the insights on offer.
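To make the third risk concrete, here’s a minimal Python sketch of how consent-gated tracking under-reports traffic. The 42% default comes from the figure cited above; real rejection rates vary by site and consent-banner design, so treat this as an illustration rather than a formula for your exact numbers.

```python
def estimate_true_sessions(observed_sessions: float, rejection_rate: float = 0.42) -> float:
    """Estimate total sessions when a share of users reject cookies.

    A cookie-based tool only records users who accept, so the observed
    count under-reports by roughly the rejection rate.
    """
    consent_rate = 1.0 - rejection_rate
    return observed_sessions / consent_rate

# e.g. 5,800 recorded sessions at a 42% rejection rate implies
# roughly 10,000 actual sessions - a gap of about 4,200.
print(round(estimate_true_sessions(5800)))
```

In other words, the numbers in your dashboard aren’t just slightly low; at today’s rejection rates they can understate traffic by close to half.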


So where does that leave your agency? You want to offer your customers a competitive advantage by providing meaningful data on the performance of their website and their user experience, but you also recognise the evolution of consumer demands and the fast-changing regulatory landscape. The answer is simple: it’s time to switch to a more modern, privacy-friendly analytics tool.

Privacy-friendly analytics

In response to the growing need for privacy-centric solutions, a new breed of analytics tools has emerged, where avoiding the arbitrary capture of personal data is the number one priority. These tools are known as privacy-friendly analytics, and this is the category that my company, Squeaky, fits into.

Squeaky is one of the few future-proof and fully compliant analytics tools on the market, and it’s also the first web analytics tool available in the Duda app store, so let me take a moment to explain why I think it could be the right choice for you and your clients…

No compromises

Firstly, the most important thing to know about Squeaky’s privacy-first analytics is that you won’t be compromising on the quality of the data you capture or the value of the insights you’ll gain.

That may sound counterintuitive, but in reality, 95% of the functionality and value in an analytics tool comes from non-sensitive data, which is most valuable when aggregated to analyse patterns and trends in your user experience. Privacy-friendly analytics solutions recognise this, and simply avoid invasive tracking technologies like cookies and IP-based tracking, as well as the capture of any other personally identifiable information (PII).
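How can a tool count unique visitors without cookies or stored IPs? A pattern common among cookieless analytics tools is a daily-rotating salted hash: the raw IP and User-Agent are combined with a salt that changes every day, hashed, and immediately discarded. The Python sketch below illustrates the general technique; it is a generic illustration with invented function names, not a description of Squeaky’s actual internals.

```python
import hashlib
import secrets
from datetime import date
from typing import Dict, Optional

# Per-day salts, held in memory only. In a real deployment yesterday's
# salt is discarded, so visitor IDs cannot be linked across days.
_daily_salts: Dict[date, str] = {}

def _salt_for(day: date) -> str:
    if day not in _daily_salts:
        _daily_salts[day] = secrets.token_hex(16)
    return _daily_salts[day]

def visitor_id(ip: str, user_agent: str, site: str,
               day: Optional[date] = None) -> str:
    """Derive a short-lived pseudonymous visitor ID.

    The raw IP and User-Agent are hashed with a per-day salt and never
    stored, so unique visitors can be counted without cookies or PII.
    """
    day = day or date.today()
    digest = hashlib.sha256(
        f"{_salt_for(day)}|{site}|{ip}|{user_agent}".encode()
    ).hexdigest()
    return digest[:16]
```

The same visitor hashes to the same ID within a day (enabling unique-visitor counts) but to a different ID the next day, so no long-term profile ever exists to leak or subpoena.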


In addition, other key benefits of privacy-friendly analytics include:

  1. No compliance issues: Out of the box, Squeaky doesn’t collect any personal data on your website visitors. This means no worries about GDPR, CCPA, LGPD, or any other data protection law.
  2. No cookies: Squeaky also doesn’t use cookies or IP-based tracking, which means no more "Accept Cookies" pop-ups, and no more missing data because of cookie rejection. You get a complete and accurate picture of user behaviour.
  3. Actionable insights: By focusing on user behaviour, rather than user data, Squeaky avoids capturing sensitive data whilst still providing the insights and actionable analytics you and your clients expect.
  4. Future-proofing: With Squeaky, you're not just complying with current regulations, you're anticipating future ones. As privacy regulations continue to evolve, Squeaky ensures you stay ahead of the curve, offering you and your clients peace of mind.
  5. User trust: By using a privacy-friendly analytics tool like Squeaky.ai, you're showing your clients and their users that you respect their privacy. This helps build trust and loyalty, which are invaluable in today's digital landscape.



Not only are you safeguarding both your own and your clients’ interests, you’re also not missing out on features, because Squeaky comes with all the standard functionality you’d expect: analytics, heatmaps, session recording, event tracking, and more.
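To illustrate what "behaviour, not personal data" can look like in practice for event tracking, here’s a hypothetical allow-list approach: only pre-approved behavioural fields ever leave the page, and anything else is dropped at the source. The field names and the `sanitise_event` helper are invented for illustration; they are not Squeaky’s actual schema or API.

```python
# Hypothetical allow-list: behavioural fields only, no names, emails,
# or raw IPs. Anything not on the list is stripped before transmission.
ALLOWED_FIELDS = {"event", "path", "referrer", "viewport", "timestamp"}

def sanitise_event(event: dict) -> dict:
    """Drop any field not on the allow-list before the event is sent."""
    return {k: v for k, v in event.items() if k in ALLOWED_FIELDS}

payload = sanitise_event({
    "event": "page_view",
    "path": "/pricing",
    "referrer": "https://duckduckgo.com",
    "viewport": "1440x900",
    "timestamp": "2023-06-30T09:00:00Z",
    "email": "user@example.com",   # stripped: PII never transmitted
})
```

The design choice here is defensive: an allow-list fails safe, because a new field added by a careless integration is discarded by default rather than silently collected.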

Conclusion

With the rise of data privacy concerns and stricter regulations, legacy analytics tools are no longer a viable option for most agencies. By switching to a privacy-friendly tool, agencies can provide better quality data capture and insights, avoid legal and financial risks, and build trust with their customers.

Moreover, privacy-friendly analytics tools like Squeaky continue to provide all the functionality you’ve come to expect in a great analytics product…oh yeah, and you can install it in Duda with just one click.

Privacy-first analytics in Duda
