The Human Algorithmic Bias

Ensuring Small Data Counters Big Data Blind Spots

GUEST POST from Chateau G Pato
LAST UPDATED: January 25, 2026 at 10:54AM

We are living in an era of mathematical seduction. Organizations are increasingly obsessed with Big Data — the massive, high-velocity streams of information that promise to predict customer behavior, optimize supply chains, and automate decision-making. But as we lean deeper into the “predictable hum” of the algorithm, we are creating a dangerous cognitive shadow. We are falling victim to The Human Algorithmic Bias: the mistaken belief that because a data set is large, it is objective.

In reality, every algorithm has a “corpus” — a learning environment. If that environment is biased, the machine won’t just reflect that bias; it will amplify it. Big Data tells you what is happening at scale, but it is notoriously poor at telling you why. To find the “why,” we must turn to Small Data — the tiny, human-centric clues that reveal the friction, aspirations, and irrationalities of real people.

Algorithms increasingly shape how decisions are made in hiring, lending, healthcare, policing, and product design. Fueled by massive datasets and unprecedented computational power, these systems promise objectivity and efficiency at scale. Yet despite their sophistication, algorithms remain deeply vulnerable to bias — not because they are malicious, but because they are incomplete reflections of the world we feed them.

What many organizations fail to recognize is that algorithmic bias is not only a data problem — it is a human problem. It reflects the assumptions we make, the signals we privilege, and the experiences we fail to include. Big data excels at identifying patterns, but it often struggles with context, nuance, and lived experience. This is where small data — qualitative insight, ethnography, frontline observation, and human judgment — becomes essential.

“The smartest organizations of the future will not be those with the most powerful central computers, but those with the most sensitive and collaborative human-digital mesh. Intelligence is no longer something you possess; it is something you participate in.” — Braden Kelley

The Blind Spots of Scale

The problem with relying solely on Big Data is that it optimizes for the average. It smooths out the outliers — the very places where disruptive innovation usually begins. When we use algorithms to judge performance or predict trends without human oversight, we lose the “Return on Ignorance.” We stop asking the questions that the data isn’t designed to answer.

Human algorithmic bias emerges when designers, decision-makers, and organizations unconsciously embed their own worldviews into systems that appear neutral. Choices about which data to collect, which outcomes to optimize for, and which trade-offs are acceptable are all deeply human decisions. When these choices go unexamined, algorithms can reinforce historical inequities at scale.

Big data often privileges what is easily measurable over what truly matters. It captures behavior, but not motivation; outcomes, but not dignity. Small data — stories, edge cases, anomalies, and human feedback — fills these gaps by revealing what the numbers alone cannot.
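The smoothing effect described above is easy to demonstrate. The sketch below uses invented satisfaction scores (all segment names and numbers are hypothetical) to show how a healthy-looking aggregate average can completely hide a struggling outlier segment:

```python
# Hypothetical numbers: an aggregate average hides the outlier segment
# where the real story (and often the innovation opportunity) lives.
satisfaction_scores = {
    "mainstream users": [8, 8, 7, 9, 8] * 20,  # 100 typical responses
    "power users":      [3, 2, 4, 3, 2],       # 5 outlier responses
}

all_scores = [s for seg in satisfaction_scores.values() for s in seg]
overall = sum(all_scores) / len(all_scores)
print(f"overall average: {overall:.1f}")  # looks healthy at a glance

for segment, scores in satisfaction_scores.items():
    avg = sum(scores) / len(scores)
    # The struggling segment only becomes visible when you stop averaging.
    print(f"{segment}: {avg:.1f}")
```

The five unhappy power users barely move the overall average, which is exactly why dashboards built on aggregates rarely flag them.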

Case Study 1: The Teacher and the Opaque Algorithm

In a well-documented case within the Washington, D.C. school district, a highly regarded teacher named Sarah Wysocki was fired based on an algorithmic performance score, despite receiving glowing reviews from parents and peers. The algorithm prioritized standardized test score growth above all else. What the Big Data missed was the “Small Data” context: she was teaching students with significant learning differences and emotional challenges. The algorithm viewed these students as “noise” in the system, rather than the core of the mission. This is the Efficiency Trap — optimizing for a metric while losing the human outcome.

Small Data: The “Why” Behind the “What”

Small Data is about Empathetic Curiosity. It’s the insights gained from sitting in a customer’s living room, watching an employee struggle with a legacy software interface, or noticing a trend in a single “fringe” community. While Big Data identifies a correlation, Small Data identifies the causation. By integrating these “wide” data sets, we move from being merely data-driven to being human-centered.

Case Study 2: Reversing the Global Flu Overestimate

Years ago, Google Flu Trends famously predicted double the actual number of flu cases. The algorithm was “overfit” to search patterns. It saw a massive spike in flu-related searches and assumed a massive outbreak. What it didn’t account for was the human element: media coverage of the flu caused healthy people to search out of fear. A “Small Data” approach — checking in with a handful of frontline clinics — would have immediately exposed the blind spot that the multi-terabyte data set missed. Today’s leaders must use Explainability and Auditability to ensure their AI models stay grounded in reality.
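The failure mode described above can be sketched in a few lines. This is a toy illustration with invented numbers, not the real Google Flu Trends model: a predictor calibrated on "searches per actual case" in a normal season doubles its estimate when media coverage adds searches from healthy, worried people.

```python
# Toy model (invented numbers): a search-volume predictor calibrated
# in a normal season, then applied during heavy media coverage.
searches_per_case = 10          # relationship learned in a normal season

actual_cases = 1_000
fear_driven_searches = 10_000   # healthy people searching after news coverage
observed_searches = actual_cases * searches_per_case + fear_driven_searches

# The model has no way to distinguish fear from fever.
predicted_cases = observed_searches / searches_per_case
print(f"predicted {predicted_cases:.0f} cases vs {actual_cases} actual")
```

A single check against a frontline clinic's intake numbers would have exposed the gap between the predicted and actual caseload immediately.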

Why Small Data Matters in an Algorithmic World

Small data does not compete with big data — it complements it. While big data provides scale, small data provides sense-making. It highlights edge cases, reveals unintended consequences, and surfaces ethical considerations that rarely appear in dashboards.

Organizations that rely exclusively on algorithmic outputs risk confusing precision with truth. Human-centered design, continuous feedback loops, and participatory governance ensure that algorithms remain tools for augmentation rather than unquestioned authorities.

Building Human-Centered Algorithmic Systems

Countering algorithmic blind spots requires intentional action. Organizations must diversify the teams building algorithms, establish governance structures that include ethical oversight, and continuously test systems against real-world outcomes — not just technical metrics.

“Algorithms don’t eliminate bias; they automate it — unless we deliberately counterbalance them with human insight.” — Braden Kelley

Most importantly, leaders must create space for human judgment to challenge algorithmic conclusions. The goal is not to slow innovation, but to ensure it serves people rather than abstract efficiency metrics.

Conclusion: Designing a Human-Digital Mesh

Innovation is a byproduct of human curiosity meeting competitive necessity. If we cede our curiosity to the algorithm, we trade the vibrant pulse of discovery for a sterile balance sheet. Breaking the Human Algorithmic Bias requires us to be “bilingual” — fluent in both the language of the machine and the nuances of the human spirit. Use Big Data to see the forest, but never stop using Small Data to talk to the trees.


Small Data & Algorithmic Bias FAQ

What is the “Human Algorithmic Bias”?

It is the cognitive bias where leaders over-trust quantitative data and automated models, assuming they are objective, while ignoring the human-centered “small data” that explains the context and causation behind the numbers.

How can organizations counter Big Data blind spots?

By practicing “Small and Wide Data” gathering: conducting ethnographic research, focus groups, and “empathetic curiosity” sessions. Leaders should also implement “Ethics by Design” and “Explainable AI” to ensure machines are accountable to human values.

Who should we book for a keynote on human-centered AI?

For organizations looking to bridge the gap between digital transformation and human-centered innovation, Braden Kelley is the premier speaker and author in this field.

Extra Extra: Because innovation is all about change, Braden Kelley’s human-centered change methodology and tools are the best way to plan and execute the changes necessary to support your innovation and transformation efforts — all while literally getting everyone on the same page for change. Find out more about the methodology and tools, including the book Charting Change, by following the link. Be sure to download the TEN FREE TOOLS while you’re here.

Image credits: Google Gemini

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.

How to Use Human-Scale Insights to Pivot Strategy

Small Data, Big Impact

LAST UPDATED: December 5, 2025 at 3:32PM

GUEST POST from Chateau G Pato

Your analytics dashboard can tell you what happened: 70% of users abandoned the checkout process at Step 3. Big Data is superb at identifying this pattern. But it is fundamentally incapable of telling you why that abandonment occurred. Was the font confusing? Was the payment system counter-intuitive? Did the user get distracted by a child? The answer to the why requires Small Data.
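The limits of the dashboard view can be made concrete. The sketch below (step names and counts are hypothetical) computes funnel drop-off from stage counts, which is essentially all an analytics tool can report: it pinpoints *where* the 70% abandonment happens, but no log line in it explains *why*.

```python
# Minimal funnel sketch (hypothetical steps and counts): a dashboard
# can locate the drop-off, but nothing here explains the reason for it.
funnel = [
    ("Cart",     10_000),
    ("Shipping",  9_000),
    ("Payment",   8_500),  # the "Step 3" of the example above
    ("Confirm",   2_550),
]

drops = {}
for (step, users), (_, next_users) in zip(funnel, funnel[1:]):
    drops[step] = 1 - next_users / users
    print(f"{step} -> next step: {drops[step]:.0%} abandoned")
```

The output flags a 70% cliff at the Payment step, and that is where the quantitative trail ends; everything after that requires talking to actual users.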

Small Data refers to the qualitative, non-numerical, contextual information collected through human observation, deep empathy, and ethnographic research. It is the core of Human-Centered Innovation. Strategy that pivots based solely on aggregated trends risks being perpetually incremental. True, disruptive pivots are always rooted in a single, profound Human-Scale Insight — the realization of an unmet need that Big Data cannot quantify because the need is emotional, procedural, or cultural.

The Three-Step Small Data Strategy Pivot

To effectively leverage Small Data, organizations must embed a simple, three-step human-centered process:

1. Embrace Ethnographic Immersion (Discovery)

Strategy cannot be designed purely from behind a desk. Leaders must mandate and participate in ethnographic immersion. This involves frontline engagement: watching how a customer actually uses a product in their home, observing the communication patterns of a surgical team, or shadowing a field technician. The goal is to collect thick description — detailed, contextual field notes that capture the environment, mood, and exact procedural friction points. This practice requires organizational humility and a commitment to unlearn existing assumptions about the customer.

2. Synthesize for “Job-to-be-Done” (Analysis)

Once Small Data is collected, the analysis must focus on the Job-to-be-Done (JTBD) framework. JTBD moves analysis away from product features toward human motivation. Instead of asking, “Why did they buy our software?” ask, “What progress was the customer trying to make in their life when they hired our software?” The qualitative data often reveals that customers hire your product for a completely different job than you think. This Human-Scale Insight is the most common driver of strategic pivots because it exposes an entirely new market definition.

3. Operationalize the Anecdote (Action)

The single greatest challenge for Small Data is scaling it up against the perceived weight of Big Data. To pivot strategy, the Human-Scale Insight must be translated into a compelling narrative and immediately tested as a Minimum Viable Product (MVP). The anecdote must be operationalized. Instead of saying, “We should change the user interface,” say, “During the home visit, Jane mentioned she feels anxious when the software asks for her social security number three times. We need to test an MVP that reduces that anxiety by asking once and explaining the ‘why’ with clear, non-legalistic language.” This grounds the change in empathy and provides clear, immediate action.

Case Study 1: The Insurance Company’s Claims Process Pivot

Challenge: Low Digital Adoption Despite App Redesign

A major insurance provider (“SecureCo”) launched a highly publicized, expensive app redesign to modernize its claims process. Big Data analytics confirmed the app was technically sound, yet 80% of major claims were still submitted via phone call or physical mail. The Big Data showed what was happening, but offered no useful path for a strategic pivot.

Small Data Intervention: Ethnographic Claims Shadowing

A human-centered innovation team decided to shadow a handful of claimants. They observed one customer, an elderly woman named Helen, trying to submit a complex claim. The Small Data revealed the following Human-Scale Insight: Helen wasn’t confused by the interface; she was terrified of making a single, irrecoverable mistake that would void her payment.

  • The app’s clean, modern interface, which minimized text to look “sleek,” made her feel unsupported.
  • The phone call, despite the wait time, provided the emotional reassurance that a human was accountable for her process.

The Strategic Pivot: Designing for Emotional Safety

The strategic pivot was not a technical fix, but an emotional one. SecureCo unlearned the assumption that speed was the top priority. They redesigned the app to include a permanent, dedicated “Help Desk Chat” button staffed by a specific, named agent for complex claims. They introduced a feature that explicitly allowed the user to undo any step, assuring them that the process was safe. By focusing on the human fear of permanent error (Small Data), the company achieved a 75% digital adoption rate for complex claims within nine months, proving that emotion drives adoption.

Case Study 2: The SaaS Firm’s Enterprise Feature Failure

Challenge: Zero Adoption of a Flagship Enterprise Feature

A B2B SaaS company (“DataStream”) developed a powerful, highly complex “Advanced Analytics Module” for its largest enterprise clients. Despite being a required feature in high-cost contracts, Big Data showed near-zero usage. Usage logs confirmed that every user who clicked the module abandoned it within 30 seconds.

Small Data Intervention: “Desk-Side” Observation

The innovation team conducted in-person, desk-side observation with five key users at a major client. The Small Data analysis showed that the official reason for the product’s existence — “complex data correlation” — was not the user’s Job-to-be-Done. The users were highly stressed analysts who needed a quick snapshot to answer a simple, recurring question from their executive team: “Is this number trending up or down today?”

  • The Advanced Analytics Module required 15 clicks and 5 minutes to generate this answer (procedural friction).
  • The analysts were actually hiring a spreadsheet hack, a complicated but reliable 30-second shortcut they had built themselves.

The Strategic Pivot: The “Executive Answer”

DataStream performed a major strategic pivot, unlearning the notion that “more complex is more valuable.” They immediately launched an MVP dashboard called the “Executive Answer,” operationalizing the anecdote (Step 3 above). This dashboard, which used the same backend data, generated the required snapshot in a single click. The pivot was based entirely on observing five users and understanding their actual Job-to-be-Done. Usage of the original, complex module remained low, but usage of the new, Small-Data-driven dashboard became mandatory within all top-tier accounts, significantly improving client retention.
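The analysts' actual Job-to-be-Done was small enough to express as a few lines of code. This is a hypothetical sketch (the function name and numbers are invented, not DataStream's real product) of the one-glance answer they were hiring their spreadsheet hack to produce:

```python
# Hypothetical sketch of the real Job-to-be-Done: a one-step answer to
# "is this number trending up or down today?"
def executive_answer(daily_values):
    """Return a one-glance trend summary from a series of daily numbers."""
    today, yesterday = daily_values[-1], daily_values[-2]
    if today > yesterday:
        direction = "up"
    elif today < yesterday:
        direction = "down"
    else:
        direction = "flat"
    change = (today - yesterday) / yesterday
    return f"Trending {direction} ({change:+.1%} vs yesterday)"

print(executive_answer([118, 120, 126]))  # -> Trending up (+5.0% vs yesterday)
```

The point is not the code but the contrast: fifteen clicks of "complex data correlation" versus one function call answering the question the executive actually asked.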

Small Data as the Change Fuel

Big Data provides the destination (e.g., “Grow revenue 15%”). Small Data provides the ignition — the human-scale insight needed to change course dramatically. Strategic change is often blocked by inertia and a fear of the unknown. By grounding a strategic pivot in a specific, observable human anecdote, leaders can create a compelling narrative that overcomes organizational resistance. The clarity and empathy derived from Small Data is the most potent fuel for Human-Centered Innovation.

“If Big Data is the map, Small Data is the compass that tells you the correct direction of travel.”

Frequently Asked Questions About Small Data

1. What is Small Data and how is it different from Big Data?

Big Data is aggregated, quantitative, and large-scale (the what and how many). Small Data is qualitative, contextual, and human-scale (the why and how). Small Data is collected through deep observation, ethnographic research, and in-depth interviews, focusing on a small number of users to gain deep, empathetic insights into their emotional and procedural friction points.

2. What is a “Human-Scale Insight”?

A Human-Scale Insight is a profound realization about user behavior, often revealed by Small Data, that exposes a latent or unmet need, emotional driver, or procedural friction point. This insight often reframes the “Job-to-be-Done” and is potent enough to drive a strategic pivot — changing not just how a product works, but why the company offers it.

3. Why is organizational “Humility” required to use Small Data effectively?

Humility is required because effective Small Data collection, like ethnographic immersion, demands that leaders and designers unlearn their existing assumptions about the customer and admit that the company may not understand the user’s true needs. It requires leaving the boardroom and observing the customer in their own environment, often revealing uncomfortable truths about product failure.

Your first step toward leveraging Small Data: Choose a product feature with low adoption, but high perceived value. Find three customers who stopped using it. Send a designer or product manager to spend 90 minutes observing them use a competitor’s product. Document the friction points, and use that Small Data to define a simple, empathetic MVP.

Image credit: Pixabay
