08 September 2025

Why I Do This: Training Companies to Use AI — Not Just Admire It

Over the last few years, I’ve worked as a tutor / trainer for a broad spectrum of organizations — from small startups with 5 people to large enterprises with hundreds or thousands of employees. The journey of taking a company from “What is that AI thing, again?” to “We deploy models, use AI in daily workflows, and trust our data-driven decisions” is rarely smooth — but when it works, it creates real, lasting value.

And that’s why I always tell my clients: AI isn’t magic. It’s a people skill, a culture shift, and a process investment. You can’t just license a fancy tool and flip a switch; successful adoption begins (and ends) with training people.

Here’s how I structure that journey — and why companies that treat AI training as a core part of their growth strategy end up ahead.


Getting Everyone Speaking the Same Language: AI Literacy First

One of the first things I do, regardless of company size, is run an “AI literacy bootcamp.”

  • Every team — engineering, sales, marketing, operations, HR — gets the same foundational overview of what AI/ML is, what it isn’t, what it can realistically do today, and where it often fails. This includes basic concepts (data, models, predictions, inference, bias, limitations).
  • The goal: build a shared baseline understanding so nobody confuses hype with reality. This levels the playing field — people stop feeling intimidated or skeptical, and instead start seeing concrete possibilities.

Why does this matter? Because if only a few “techie” folks understand AI, then adoption remains siloed, projects stall, and broader teams don’t buy in. I’ve seen many companies waste money that way.

Having everyone fluent in basic AI concepts also helps break down resistance. When non-technical staff realize AI isn’t “some black-box magic,” but a set of tools — they start asking useful questions, contribute ideas, and even influence business-relevant use-cases.


Tailoring Training to the Company’s Business Needs — Not Just Teaching ML for ML’s Sake

It’s tempting (and common) for companies to say: “We want everyone trained in machine learning.” But in reality, they rarely need every employee to become a data scientist; they need people who know how AI applies to their role.

When I onboard a client, I first lead a workshop:

  1. We map current workflows, pain points, and opportunities.
  2. We ask: Which tasks could AI realistically improve? Where is data abundant? Where is human judgement critical?
  3. We set clear, measurable goals: e.g. “reduce report-preparation time by 50% using ML,” or “use AI to triage support tickets and reduce average response time to under 2 hours.”
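For a goal like ticket triage, I often have teams start with a deliberately simple baseline, so the measurable target exists before any model does. A minimal sketch (the keywords, categories, and `Ticket` shape are hypothetical, not a production system):

```python
from dataclasses import dataclass

# Hypothetical routing rules; a trained classifier would eventually replace
# these, but even a baseline makes the "response under 2 hours" goal measurable.
URGENT_KEYWORDS = {"outage", "down", "data loss", "security"}

@dataclass
class Ticket:
    subject: str
    body: str

def triage(ticket: Ticket) -> str:
    """Assign a support ticket to a priority queue via simple keyword rules."""
    text = f"{ticket.subject} {ticket.body}".lower()
    if any(kw in text for kw in URGENT_KEYWORDS):
        return "urgent"      # route to on-call staff first
    if "invoice" in text or "billing" in text:
        return "billing"
    return "standard"

print(triage(Ticket("Production outage", "The site has been down since 09:00")))
# prints: urgent
```

The point of an exercise like this is not the rules themselves; it’s that once a baseline exists, the team can measure how much an actual ML model improves on it.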

Then — and only then — I design a custom learning pathway. For example:

  • For marketing teams: brief sessions on generative-AI tools for content creation, customer-segmentation models, AI-driven A/B testing.
  • For operations: training on data-analysis tools, basic predictive modeling, anomaly detection workflows.
  • For engineering/data teams: deeper modules on model training, evaluation, versioning, deployment, monitoring.
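For the operations track, I like to demystify “anomaly detection” with the smallest example that still works before introducing real tooling. A toy sketch using a z-score rule over a metric series (the threshold and the data are illustrative assumptions):

```python
from statistics import mean, stdev

def zscore_anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Return the indices of points more than `threshold` standard deviations
    from the mean -- a crude but teachable anomaly detector."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > threshold * sigma]

# 20 normal readings plus one spike at the end
daily_latency_ms = [10.0] * 20 + [100.0]
print(zscore_anomalies(daily_latency_ms))  # [20] -- only the spike is flagged
```

Once people see that the core idea fits in ten lines, the jump to real anomaly-detection workflows feels much less intimidating.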

The advantage: companies end up learning what they need, not what seems neat or trendy.

This approach aligns with research showing that upskilling works best when it’s tied directly to business-aligned goals, not generic “AI literacy” ambitions.


Building a Culture of Learning — And Reducing Fear of AI

A recurring challenge in my experience: many employees, especially non-technical staff, are fearful. Not of learning AI, but of what it might mean for their jobs.

To combat that, I focus on psychological safety and empowerment:

  • I present AI as a tool to amplify human effort, not replace it — helping people do more, faster, or with better precision.
  • I include hands-on, low-stakes workshops: sandboxed environments where people can experiment with AI tools (e.g. generative text tools, simple data-analysis dashboards) without fear of messing up real projects.
  • I encourage peer learning and collaboration: people from different teams share their ideas, successes, and concerns. The goal: create internal “AI ambassadors” — staff who get excited and lead small-scale experiments.

This social / cultural side of training is critical. According to some of the best practices for organizational AI adoption, fostering willingness to learn and giving people autonomy in their learning journey is a key success factor.


From Training to Practice: Embedding AI into Daily Workflows

Training alone won’t transform a company — integration does. That means after the training sessions, I work with leadership and individual teams to embed AI tools into actual workflows.

This often involves:

  • Defining where AI can have quick wins (e.g. automating data cleaning, report generation, content drafts, ticket triage).
  • Setting up pilot projects to test AI-driven workflows.
  • Creating governance guidelines: how to use AI ethically, how to validate model outputs, when to use human oversight.
  • Training on maintenance and monitoring, especially if models are deployed in production: someone (or a team) needs to own evaluation, retraining, data drift detection, performance tracking.
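To make “data drift detection” concrete in the maintenance training, I sometimes walk through a from-scratch Population Stability Index (PSI), one common drift score. A sketch under stated assumptions: the bin count and the 0.25 alert threshold below are conventional choices, not universal rules.

```python
import math

def psi(baseline: list[float], live: list[float], bins: int = 10) -> float:
    """Population Stability Index: compares the distribution of live data
    against a training-time baseline. Rule of thumb: > 0.25 = significant drift."""
    lo, hi = min(baseline), max(baseline)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def shares(sample: list[float]) -> list[float]:
        counts = [0] * bins
        for v in sample:
            counts[sum(v > e for e in edges)] += 1
        # floor each share at a tiny value so log() never sees zero
        return [max(c / len(sample), 1e-4) for c in counts]

    b, lv = shares(baseline), shares(live)
    return sum((li - bi) * math.log(li / bi) for bi, li in zip(b, lv))

baseline = [float(v) for v in range(100)]
shifted = [v + 50.0 for v in baseline]
print(round(psi(baseline, baseline), 4))  # 0.0 -- identical data, no drift
print(psi(baseline, shifted) > 0.25)      # True -- the shift is flagged
```

In production, teams would typically reach for a monitoring platform rather than hand-rolled code, but writing the score once by hand makes it clear what those dashboards are actually measuring.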

Embedding AI as part of “business as usual” — not a one-off experiment — helps ensure value is sustained.

This approach follows what many experts recommend: treat AI adoption as a change-management problem, not just a tech upgrade.


The Long Haul: Continuous Learning & Adaptation

One of the biggest mistakes I see: companies treat AI training as a one-time event. They run a few workshops, check “done,” and move on. Inevitably, that leads to stagnation.

Instead, I encourage clients to view AI upskilling as continuous, for these reasons:

  • AI evolves fast. New tools, better models, shifting best practices — what was cutting-edge a year ago may already be outdated. Staying static means falling behind.
  • Business needs evolve. New challenges, changing markets, new data sources — companies need agility. A workforce trained only once loses relevance quickly.
  • Scale and growth. As companies grow, more employees join and new departments are added; all of them need onboarding. Having a repeatable training infrastructure (a “training pipeline,” not just a one-off workshop) is vital.

The organizations that thrive — in my experience — are those that commit to building a culture of learning: regular refresher courses, internal knowledge-sharing sessions, AI hackathons, open forums for experimentation, and evolving governance.

This aligns with modern thinking: successful AI upskilling is less about a one-off training push and more about building organizational resilience through ongoing education and adaptability.


What I’ve Learned as a Trainer: Mistakes, Pitfalls & What Works

After hundreds of training engagements, some patterns stand out (and some lessons were hard-earned).

Mistake 1: Starting with technology, not problems.

Companies often ask: “Teach us machine learning.” But without a clear sense of why — what problem they are trying to solve — training becomes academic. Worse: models are built for the sake of building models, but never see production.

Mistake 2: Training only technical staff.

If only engineers or data scientists are trained — while sales, marketing, operations remain untouched — adoption stalls. Many opportunities happen outside code: think content generation, customer insights, process automation, or decision support. Skipping non-technical teams leaves value on the table.

Mistake 3: Treating AI as a “set and forget.”

Without maintenance, monitoring, and governance, AI deployments decay: data drifts, models degrade, and reality falls short of the hype. Training that fails to emphasize long-term upkeep wastes time and money.

What works:

  • Start with use cases and business value. Let problems drive training.
  • Build cross-functional learning paths, tailored to roles, needs, and technical comfort.
  • Foster experimentation and psychological safety — let people play, fail, try again, invent.
  • Invest in governance, ethics, and maintenance, not just model-building.
  • Promote ongoing learning, not single-event training; embed AI literacy into company culture.

Why This Matters Now — and What’s at Stake

AI isn’t a niche tool anymore. As more companies, big and small, explore automation, insight generation, predictive analytics, generative content, and decision support, the gap between those who use AI effectively and those who don’t is becoming a competitive divide.

  • Companies that invest in training gain productivity, agility, insight and scale. Employees spend less time on repetitive tasks, more time on strategic and creative work.
  • Organizations that ignore training—or treat it as a checkbox—risk wasted investments, failed projects, or even worse: decision-making based on flawed or outdated AI outputs.
  • Importantly: as automation rises, ethical use of AI becomes critical. Companies that instill responsibility, transparency, and human-centric oversight in training are better positioned to avoid bias, compliance issues, and reputational risk.

In short: training your people is not optional — it’s foundational. And done right, it’s what turns AI from a shiny experiment into a sustainable advantage.


Final Thoughts: I Don’t Just Build Models — I Help Build Capable Teams

When I look back at the companies I’ve worked with — some small, some massive — the ones that succeed are never the ones that splurged on the fanciest tech. They’re the ones that:

  • started with humility;
  • invested in people;
  • recognized that AI is a tool, not a magic wand;
  • committed to continuous learning, not one-off training;
  • integrated AI into workflows, culture, and decision-making.

As a trainer, my goal is never to create AI experts at every level. My goal is to build AI-capable organizations — teams where AI is understood, used responsibly, and evolves with the business.

Because at the end of the day, AI’s true power doesn’t lie in algorithms or compute — it lies in people who know how to use it.