Do websites count as wiretapping? California law firms think so

August 19, 2024

The information provided within this article does not, and is not intended to, constitute legal advice; instead, all information, content, and materials available within this article are for general informational purposes only. The information herein should not be relied upon in regard to any particular facts or circumstances without first consulting a lawyer. Any views expressed herein are those of the author, who is not a legal professional.


When most people imagine “wiretapping,” typically a website isn’t the first thing that comes to mind. You might be picturing a team of federal agents, ear to a headset, with a big tape recorder running in the background. What about a chatbot, though, or even a tracking pixel? Is that a form of wiretapping?


What is wiretapping?


Let’s take a look at the legal definition of wiretapping under the California Invasion of Privacy Act (CIPA). According to the law firm Neal, Gerber &amp; Eisenberg, California Penal Code §631 defines “wiretapping” as “using a machine or instrument to intentionally make a connection via a line or cable to read or attempt to read the contents of a communication.”


This law also prohibits the use of both “pen register” and “trap and trace” devices without explicit consent or a court order. The statute defines a “pen register” in §638.50 as “a device or process that records or decodes dialing, routing, addressing, or signaling information transmitted by an instrument or facility from which a wire or electronic communication is transmitted, but not the contents of a communication.” A “trap and trace” device is similar: it records incoming impulses, electronic or otherwise, that identify the originating signal information and, by extension, the reasonable identity of the source of that information—again, without collecting the contents of the transmission.


In short, a “pen register” records phone numbers or “routing” information outgoing from a device, while a “trap and trace” device records incoming routing information. The key here is that the statute defines these devices as capturing “routing information,” not just phone numbers.


That all sounds pretty vague, huh? It certainly is, and that’s where the problem arises.


Website owners are being sued for wiretapping


Last year, the U.S. District Court for the Southern District of California denied a motion to dismiss in Greenley v. Kochava, according to experts at Husch Blackwell. In that case, the plaintiff argued that the developer of a software development kit (SDK) had violated CIPA by including code that forwarded users’ location information to the developer without their knowledge. The plaintiff argued this was tantamount to a “pen register” or “trap and trace” device, in that it revealed the source of a particular communication even if it didn’t reveal the communication itself.


Keep in mind that this law was written in 1967, so this interpretation is quite a stretch from the statute’s original intent. However, there are plenty of creative law firms out there. Plaintiffs have filed hundreds, if not thousands, of individual and class action lawsuits in California courts asserting new, creative applications of CIPA to new technologies, according to lawyers at Nixon Peabody.


These aren’t shots in the dark, either! Plaintiffs are seeing some level of success in their suits—though not all of them.


In Licea v. Hickory Farms, the court rejected the notion that collecting a visitor’s IP address constituted an illegal pen register. So while simple analytics are probably safe from a CIPA suit, it isn’t exactly clear which technologies do constitute illegal wiretapping. There are, however, a few common targets.


Plaintiffs' attorneys are primarily bringing claims against businesses for their use of chatbots, website session replay technologies, and pixel tracking technologies, according to Bloomberg Law.


  • Chatbots are being likened to what CIPA considers a “secret” wiretap that allows third parties to listen in on a conversation without the user’s consent.
  • Website session replay technology, like Microsoft Clarity, may, according to plaintiffs, allow website owners to eavesdrop on private conversations for use in targeted advertising.
  • Pixel tracking, a common target of many privacy laws (including the recently passed Colorado Privacy Act), may allow businesses to surreptitiously collect information about user interactions and behaviors.


Frankly, the merits of these lawsuits aside, these are technologies you should be concerned about from a privacy perspective anyway. A growing number of states have enacted increasingly stringent regulations governing the use of these technologies, rules that are far clearer than this archaic interpretation of a decades-old California law.


Yet what makes this law so enticing for plaintiffs is the payout. CIPA claims can carry statutory damages of up to $5,000 per violation. Since each visitor can count as a separate violation, even a quiet website with only a couple hundred visitors per month could face a bill totaling nearly a million dollars.
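To make that arithmetic concrete, here is a minimal sketch. The $5,000 figure comes from the statute; the function name and the visitor counts are purely hypothetical illustrations, not a damages model any court actually applies:

```typescript
// Statutory damages under CIPA can reach $5,000 per violation.
const CIPA_MAX_STATUTORY_DAMAGES = 5_000;

// Rough worst-case exposure if each tracked visitor counts as one violation.
// Both parameters are hypothetical inputs for illustration only.
function estimateCipaExposure(monthlyVisitors: number, months: number): number {
  return monthlyVisitors * months * CIPA_MAX_STATUTORY_DAMAGES;
}

// A quiet site with 200 visitors per month: a single month of
// non-compliant tracking already reaches $1,000,000 in theoretical exposure.
const exposure = estimateCipaExposure(200, 1); // 1_000_000
```

Actual awards depend on how a court counts violations, but the per-visitor multiplier is what makes even small sites attractive targets.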


CIPA has a long reach


Despite the name, you don’t necessarily need to be based in California to be impacted by the California Invasion of Privacy Act.


Experts at BakerHostetler recommend that businesses situated outside of California that are facing, or concerned about, a CIPA suit ask themselves the following questions:


  • Is the company website California-specific (i.e., is the subject matter of the website specifically tailored to Californians)?
  • Does the company engage in activities to drive California residents to its website?
  • Does the company specifically profit from California website viewers (as opposed to viewers generally)?
  • Does the company profit from California-specific advertisements on its website?
  • What percentage of the website’s users are associated with a California address?
  • Where is the website hosted (i.e., within California or in a location specifically intended to increase the number of California users)?
  • Are the website’s terms and conditions and/or privacy policies aimed at Californians or users as a whole?
  • Does the website collect the same information on all users, or does it collect different information if the user is associated with a California address?


Businesses should also consider whether their servers are physically located in California. For modern, distributed, cloud-based websites, that isn’t always clear. In fact, none of these questions are particularly straightforward to answer.


Stop these lawsuits in their tracks


All privacy laws, including CIPA, tend to share a uniting factor: user consent. A fantastic way to help your clients avoid these frivolous shakedown attempts is to request explicit permission before installing cookies, recording session information, or performing nearly any other kind of behavior tracking.
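As an illustration of that consent-first approach (not legal advice, and every identifier below is hypothetical), the core of a consent gate is simply refusing to run any tracker the visitor hasn’t opted into:

```typescript
// Hypothetical consent categories a banner might collect.
type ConsentCategory = "analytics" | "session_replay" | "chat" | "advertising";

// Record of what the user explicitly opted into. An empty record means
// nothing runs: no tracking until the visitor affirmatively consents.
type ConsentState = Partial<Record<ConsentCategory, boolean>>;

// Decide whether a given tracker may run under the current consent state.
// Anything missing or false is treated as "no consent."
function mayLoad(consent: ConsentState, category: ConsentCategory): boolean {
  return consent[category] === true;
}

// Example: the visitor consented to analytics only, so session replay
// (a frequent CIPA target) stays off.
const consent: ConsentState = { analytics: true };
mayLoad(consent, "analytics");      // true
mayLoad(consent, "session_replay"); // false
```

In a real site this check would gate the actual script injection (only appending a chatbot or session-replay snippet after the relevant category returns true), and the consent record would be persisted and timestamped as evidence of opt-in.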


Not only will this protect your clients from scary, unnecessary demand letters, but it’ll also future-proof their websites. This month alone, new privacy laws went into effect in Florida, Oregon, and Texas, with Montana quickly following on October 1st.


The United States federal government isn’t far behind, either. In April, a bipartisan, bicameral coalition unveiled a draft of the “American Privacy Rights Act,” following increasing public pressure to protect consumer privacy online.


Now is a great time for your clients to get their data houses in order, before it’s too late. Well-written privacy policies, comprehensive terms of service, and effective cookie consent banners can work in tandem to protect your customers from aggressive privacy lawsuits like the CIPA claims appearing in courts across California.


Apps like Termly and Termageddon can dramatically simplify this process for your SMB customers, especially those without a dedicated legal team. Termageddon has written extensively about the threat of CIPA lawsuits, and is proud to offer a policy generator that explicitly protects against this specific threat.


While these lawsuits may be frivolous, an old saying still rings true: “better safe than sorry.”


Shawn Davis

Content Writer, Duda

Denver-based writer with a passion for creating engaging, informative content. Loves running, cycling, coffee, and the New York Times' minigames.

