Image generated using ChatGPT from a concept developed by Alliance Enterprises.
AI 4.0: Learning the Craft Before Using the Tool
Artificial intelligence (AI) continues to shape vocational rehabilitation counseling. In our previous discussions, we explored what AI is, why it matters, and how it is influencing our work. At some point, however, every conversation about innovation must shift from understanding to application.
This is that moment.
As we move into this next series, the focus turns to practical use—what rehabilitation counselors can actually do with AI in their daily work. That journey begins more simply than many might expect, but not more easily. Like any professional skill, it requires intention, discipline, and a commitment to ethical practice.
Before taking those steps, we must ground ourselves in the ethical guidance already offered by our profession. Both the American Counseling Association and the Commission on Rehabilitation Counselor Certification have made expectations clear. Counselors using AI must protect confidentiality, maintain transparency, understand the limitations of these tools, avoid over-reliance, and—most importantly—keep the human firmly in the loop. AI may inform, but it does not decide. That responsibility remains with us.
With that foundation in place, I want to suggest two beginning steps: building skill and building a profile.
Step One: Learning the Skill
When I first began working with AI, I did not start with case notes, eligibility determinations, or vocational planning. I started somewhere much simpler—I played with it.
Over the past 18 months, I have learned from colleagues, attended workshops, taken online courses, and experimented. In many ways, the experience reminded me of learning vocational evaluation early in my career. I did not begin by administering assessments. I practiced with the tools. I learned their structure, their limitations, and their potential—long before they ever became part of my professional work.
AI is no different.
I asked questions about topics I enjoyed—hunting, gardening, travel, fishing, even cooking. I explored how it responded, what worked and what didn’t. At times, I even asked it to teach me how to use it. Through that process, I began to understand something important: AI is not intuitive like human conversation—it responds to how we engage with it. The better we become at interacting with it, the more useful it becomes.
Recently, I attended several workshops focused on AI in the public sector. In nearly every session, someone asked the same question: Which AI system should we use?
The answers were remarkably consistent. First, use the system your organization provides. Second, if no system is designated, use the one you can access most easily.
There is wisdom in that simplicity.
While different platforms have strengths and weaknesses, they all perform the core functions we need at this stage. The real variable is not the system—it is the skill of the user. As your experience grows, you will naturally begin to see which tools align best with your work.
In my own experience, I began with free versions of AI tools and found them helpful in learning the basics. Over time, I chose to experiment with paid versions. What I noticed was not a change in what the systems could do, but a change in capacity—longer responses, more detailed analysis, and greater consistency.
I am not suggesting that a paid system is necessary. Skill development remains the most important factor. However, for those who find value in the tool and want to expand its use in professional practice, increased capacity can make a difference.
Step Two: Building a Professional Profile
Once you begin to develop comfort with AI, the next step is more intentional: teaching the system how to work with you.
AI does not begin with an understanding of rehabilitation counseling. It does not know your role, your ethical framework, or the population you serve. It certainly does not know your voice. All of that must be taught.
This is where building a profile becomes essential.
Most AI systems allow customization—preferences or instructions that shape how the system responds. In my own work, I began by establishing clear boundaries. I instructed the system that anything I produce must remain my own work. It may assist with research, offer suggestions, and improve clarity, but it cannot replace authorship. The risk of unintentional plagiarism is too great to ignore.
I then began to describe who I am professionally. I shared my roles, expertise, the populations I serve, and the values that guide my work. I identified my preferred language—terms such as consumer and participant—and emphasized the importance of dignity, inclusion, and opportunity.
I also spent time shaping how the system communicates. I asked for a tone that is warm, present, and grounded in storytelling. That matters more than it might seem. If AI is to support our work, it should not sound like a machine. It should reflect the human relationship at the center of rehabilitation counseling.
Finally, I defined the types of tasks I expected the system to support—writing, editing, research, and idea development. Each of these instructions helps the system begin to align with the way we think and work.
Ethics Within the Process
Even as we build skill and profile, ethical considerations remain constant.
Confidentiality must always be protected. Unless you are working within a secure, organization-approved system, no identifying consumer information should ever be entered. All examples must be anonymized. The responsibility to comply with the Health Insurance Portability and Accountability Act (HIPAA) and the Family Educational Rights and Privacy Act (FERPA) does not change because the tool is new.
Bias is another critical concern. AI systems are built on large datasets that often reflect societal assumptions, including those about disability. Left unchecked, these systems may underestimate potential, overlook accommodations, or default to traditional expectations that do not reflect the lived experience of the people we serve.
We must actively teach the system to think differently.
We must emphasize consumer choice, explore accommodations and supports, and push beyond the limitations of the “average” case. In doing so, we are not simply improving the output of AI—we are reinforcing the values that define our profession.
Foundation and What’s Next
This may feel like a great deal at the beginning. In truth, it is no different from learning any professional skill. We study the foundations, we practice deliberately, and over time, competence becomes confidence.
I am reminded of how we were trained as counselors. We began with theory—understanding approaches and philosophy. Then we moved into pre-practicum, where we practiced the skills, often imperfectly. Finally, we stepped into internship, where those skills became real in the lives of the people we served.
AI follows that same path. It begins with understanding, moves into practice, and only then becomes part of our professional work.
In this discussion, we have focused on two steps: learning the tool and building a profile. If done well, these steps create the groundwork for everything that follows. They position AI not as a replacement for professional judgment, but as a tool that can expand our capacity and improve how we serve others.
In our next conversation, we will take the next step—how to shape effective prompts for editing, research, and document creation. That is where the tool begins to take on practical form.
For now, the task is simpler. As a vocational rehabilitation counselor:
Learn the tool.
Teach it who you are.
And never forget who is responsible for the work.
*This blog has been edited with the assistance of ChatGPT.
Note: Ethical guidance referenced in this blog is drawn from publications by the American Counseling Association (ACA) and the Commission on Rehabilitation Counselor Certification (CRCC). Readers are encouraged to review those materials for additional detail.

