Artificial Intelligence in VR Counseling: Powerful Tool, Not a Substitute for Judgment

One of the most frequent requests we receive as a case management software company is, “How can we integrate artificial intelligence (AI) into our case management system?” At conferences across the country, I’ve noticed that the sessions drawing the biggest crowds are centered on AI. It’s often viewed as a sort of magic wand—promising better time management, greater efficiency, and improved outcomes for consumers.

But as appealing as the promise of AI may be, especially in the context of rehabilitation counseling, there are critical considerations before integrating it into practice or your case management system. AI certainly holds value for tasks like scheduling, participant engagement, career exploration, labor market insights, and drafting case notes. Yet we must also weigh substantial risks—to the agency, to the counselor, and most importantly, to the individuals with disabilities we serve.

Let’s take a deeper look at those risks and the essential questions vocational rehabilitation (VR) counselors should ask when considering the use of AI.

Understanding the Risks

Many commercial AI tools operate as public platforms. Information entered into them may be stored, processed, or used to retrain the underlying models—placing it beyond the agency's control. That poses a clear risk of violating HIPAA and FERPA regulations.

There are two ways to mitigate this risk:

  1. Use anonymous input—never entering identifiable participant information.
  2. Implement enterprise or private AI systems—which operate securely within protected networks and do not share or learn from sensitive user data.
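To make the first mitigation concrete, here is a minimal sketch of how obvious identifiers (emails, phone numbers, SSN-style numbers) could be scrubbed from a case note before it is pasted into any external AI tool. The function name and patterns are illustrative assumptions, not part of any particular case management system, and pattern matching alone is not full de-identification—names, addresses, and case numbers still require human review.

```python
import re

# Placeholder tokens mapped to patterns for common identifiers.
# SSN is listed first so it is replaced before the phone pattern runs.
PATTERNS = {
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def redact(text: str) -> str:
    """Replace common identifier patterns with placeholder tokens."""
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

note = "Reached participant at 555-123-4567 (jane.doe@example.com)."
print(redact(note))  # Reached participant at [PHONE] ([EMAIL]).
```

A sketch like this catches only structured identifiers; the safer path remains an enterprise or private AI system, as described in the second mitigation.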

Algorithmic Bias and Exclusion

AI models are trained on massive datasets, often shaped by societal patterns and prejudices. This can lead to systemic bias—underestimating capabilities or overgeneralizing outcomes.

Consider the classic bell curve: most data points cluster near the center, with fewer outliers on either end. AI tends to favor what’s “typical,” which means individuals with disabilities who fall outside conventional norms may be overlooked, misjudged, or limited in the options they’re presented.

The result? A false ceiling on potential. Predictive analytics can inadvertently overestimate limitations or underestimate possibilities, leading to missed opportunities for growth and independence.

Impact on Informed Choice and Counselor Judgment

Informed choice is a foundational principle in VR services. Yet if a vocational rehabilitation counselor relies too heavily on AI-generated recommendations, they may unintentionally override or narrow the participant’s voice. The technology, while helpful, is not a neutral tool—it’s shaped by data, not by dreams, goals, or lived experiences.

There’s also a creeping risk of professional detachment. The ease of using AI can tempt counselors to lean on it in place of their own clinical judgment and intuitive understanding of people. This diminishes the power of the therapeutic relationship—a core element of effective counseling.

Carl Rogers once said, “The curious paradox is that when I accept myself just as I am, then I can change” (Rogers, 1961, p. 17). That kind of acceptance—empathic, human, and nonjudgmental—can’t be coded into a machine.

Key Questions Before You Use AI

If you’re considering integrating AI into your counseling practice or agency case management system, ask yourself three guiding questions:

  1. Is it legal?

Many states have enacted or are exploring legislation that limits or regulates AI use in health and human services. Federal agencies have also weighed in with varying degrees of oversight. Before using AI in any part of your practice, verify your agency’s standing. What is permitted? What is restricted?

  2. Is it within policy?

Most organizations are in the process of defining their AI policies. Have you reviewed yours? Does it clarify what is allowed, how data is handled, and how counselors should be trained?

Your ethical compass should be calibrated not only to legality but to your agency’s policies and expectations.

  3. Is it ethical?

This is the most expansive question—and often the most overlooked. Just because something is legal doesn’t mean it aligns with our profession’s ethical standards.

The Commission on Rehabilitation Counselor Certification (CRCC) Code of Ethics (2023) may not yet mention AI by name, but its core values offer clear guidance.

The six fundamental principles of ethical practice are:

  • Autonomy – Respecting the right of individuals to be self-governing
  • Beneficence – Acting in the best interest of the participant
  • Fidelity – Being trustworthy and loyal
  • Justice – Ensuring fairness and equity
  • Nonmaleficence – Doing no harm
  • Veracity – Being truthful and accurate

Each of these has direct relevance to the use of AI:

  • Fidelity and nonmaleficence urge us to protect participant data.
  • Autonomy, beneficence, justice, and veracity call us to maintain informed choice, avoid algorithmic bias, and represent consumers fairly and truthfully in documentation.

The CRCC Code also speaks to:

  • Section B.6 – Accurate recordkeeping to facilitate continuity of services
  • Section E – Professional competence and working within your area of training
  • Section K – Appropriate and secure use of technology

If you’re not trained in AI, its application may fall outside your scope. If you are using it, are you doing so in a way that upholds these ethical standards?

Where AI Adds Value

When used legally, ethically, and intentionally, AI can be a tremendous asset to vocational rehabilitation. It should augment, not replace, counselor judgment.

Here’s where it can help:

  • Career exploration: A first step to spark options—not to dictate them.
  • Labor market insights: Understanding employer needs and regional trends.
  • Drafting documents: Case notes, plans, reports—improving clarity and accuracy.
  • Accommodation planning: Identifying assistive technology, when paired with counselor expertise.
  • Efficiency: Scheduling, email writing, and contact tracking—freeing counselors to spend more time with participants.
  • Performance analytics: Helping agencies understand trends and outcomes.
  • Reducing screen time: more face time with participants, less time at the computer.

Conclusion: Leading with Wisdom, Not Hype

Artificial intelligence is not magic—it’s a mirror of the data we feed it and the priorities we assign to it. In vocational rehabilitation counseling, AI has the power to streamline our work and support our decisions—but it must never override our professional responsibilities, ethical obligations, or human connection.

As we navigate this evolving landscape, we must ask not just what AI can do, but what it should do. We must remain vigilant, informed, and grounded in our mission: empowering individuals with disabilities to pursue meaningful, self-directed lives.

Used wisely, AI can be a powerful ally. But only if the counselor remains in the lead.

Reference

Commission on Rehabilitation Counselor Certification. (2023). Code of professional ethics for certified rehabilitation counselors.

Rogers, C. R. (1961). On becoming a person: A therapist’s view of psychotherapy. Houghton Mifflin.

*Blog developed with the assistance of ChatGPT for editing.