Reflection Tools

You use AI every day. How well do you know what you bring to it?

Most people can describe what AI does. Few can describe what they do with it: the assumptions they carry, the patterns they fall into, the strengths they underuse.

When MIT Sloan researchers studied what determines AI output quality, they found that half depends on the model and half depends on the user. Not the prompt. The user: their mental model, their judgment about when to trust and when to override, their interaction habits.

That finding sits alongside a harder one. Vaccaro et al. (2024) analyzed 106 studies and found that human-AI teams perform worse than the best of either alone, on average. Complementary performance is possible. It just requires specific capacities that most people have never measured.

These tools make the invisible visible. They are grounded in peer-reviewed research and designed for people who already use AI thoughtfully and want to understand their own patterns with more precision.

See yourself

Four self-report tools that map how you perceive AI, how you interact with it, what it does to your thinking, and where you are on the learning curve. Start anywhere.

Perception

How do you see AI?

Rate AI on eight dimensions and discover which attitude cluster shapes how you interact with it.

~3 min
Style

What's your prompting style?

Work through realistic scenarios and see whether you lean directive, conversational, or co-creative.

Coming soon
Reflection

Better or worse thinker?

Reflect on how AI has changed your thinking habits and whether it is augmenting or replacing your judgment.

Coming soon
Growth

What type of AI learner?

Find out where you are on the AI learning curve and what to focus on to keep growing.

Coming soon

Test yourself

The tools above ask how you see yourself. These five ask how you perform. They present real AI outputs, real capability questions, and real trust decisions, then compare your responses to empirical benchmarks. The gap between self-perception and reality is where the most useful insights live.

Available through advisory engagement or workshop participation.

Calibration

Where is AI's edge?

Predict how well AI handles specific tasks, then see how your intuition compares to reality.

Calibration

Can you spot the flaw?

Evaluate real AI-generated outputs and find out how well you distinguish accurate work from subtly flawed work.

Mindset

What are you afraid of?

Identify which specific anxiety shapes your caution around AI, from job concerns to learning fears.

Calibration

You or AI?

Compare your judgment against the evidence on who does what better: you or the model.

Calibration

When do you trust AI?

Decide whether to follow or override AI recommendations and see if your trust is well-placed.

More tools coming. Leave your email to be the first to know.

Reflection is the starting point. The patterns these tools reveal are the same ones explored in the essays and developed through practice in the program.