Picture this: I’m an Army lawyer sitting in an airport lounge, fielding a challenging legal question from a concerned commander. Typically, my go-to response would be, “Have you checked the Commander’s Legal Handbook?” This comprehensive guide has saved me more times than I can count, offering straightforward answers to myriad legal issues. But there I was, far from my usual resources, facing a question I needed to look up myself.
So, I turned to a tool I trust as a reliable starting point, found a satisfactory answer and managed to assuage my commander’s concerns for the moment. What was this tool? Could it be Google, ChatGPT, Bing AI or even Claude? Does it matter, so long as it’s accurate, and I still operate within my professional and ethical obligations?
This real-world dilemma prompted me to explore the role of artificial intelligence (AI) in military law. Though my initial attempt to ignite a dialogue about AI’s applicability within my professional community, by way of a legal publication, was met first with enthusiasm, then skepticism and, eventually, a shutdown, my findings were revealing. Tools like ChatGPT can answer basic military justice questions just as effectively as traditional resources like the Commander’s Legal Handbook.
Now I am a believer. AI can be a powerful ally for military lawyers and their clients, a viewpoint that, unfortunately, remains in the shadows.
Simple Advice
The cornerstone of military legal practice has always been making complex laws and regulations accessible and understandable for commanders and service members. From publications like the Commander’s Legal Handbook to various legal training materials, judge advocate general (JAG) officers have long strived to distill complicated legal jargon into actionable advice.
These resources serve as essential guides for those who need immediate legal direction but may not have instant access to legal consultation. The Army supplements these resources with attorneys available at the unit level and trusts commanders to pair online and print materials with genuine legal advice.
If the essence of JAG work is to provide comprehensive yet straightforward legal advice, then AI tools like ChatGPT are the next logical evolution of this practice. These AI systems are designed to do what JAG corps handbooks and guides do: offer immediate, accessible answers to legal questions.
Whether it’s understanding the nuances of the Uniform Code of Military Justice or clarifying rules of engagement, AI can deliver this information in an efficient and standardized manner. It’s not about replacing human expertise; it’s about augmenting it. AI can help JAGs fulfill their mission more effectively, making the law even more accessible to clients.
Yes, AI lacks the nuance and understanding of a trained legal mind. That’s a valid point, one my own exploration of AI products has confirmed. But let’s not forget that the reference materials lawyers use also lack that nuance. They serve as starting points for legal inquiry, just as AI does.
If you are a lawyer and have used Google Search, you have used AI as a starting point for legal inquiry (and I know you’ve used Google). Therefore, when it comes to potential downsides like ethical concerns or lack of nuance, are these not issues people already navigate when using traditional resources? The key difference is that AI can deliver these answers more efficiently and in a more interactive manner, resembling a conversation more than static text on a page.
The Good and the Bad
While I advocate for AI as an extension of the JAG corps’ existing legal resources, I am not suggesting wholesale adoption. Any technological leap, especially one promising to revolutionize the way lawyers access and process legal information, comes with its own set of advantages and drawbacks. As practitioners committed to upholding the highest standards of legal and ethical conduct, JAGs must weigh the drawbacks carefully against any perceived benefit. What JAGs should not do is shut down these necessary conversations.
Here are the potential benefits and potential downsides I foresee:
Potential Benefits
• Efficiency: Time is of the essence in military operations and legal decision-making. AI can drastically reduce the time it takes to find relevant regulations, case law or statutes, allowing military lawyers to focus on crafting the most effective arguments or advice.
• Consistency: AI can provide standardized answers to recurring legal questions, enhancing the uniformity of legal advice across command levels. This can be particularly beneficial for commanders who may not have immediate access to a JAG officer.
• Accessibility: With an AI tool, both lawyers and commanders have access to instant legal support, especially in remote locations where a legal adviser may not be readily available. This democratizes access to quality legal insights.
• Data-driven decisions: AI can analyze vast amounts of data to identify trends or anomalies that might be overlooked by human researchers, leading to more informed legal strategies and decisions.
Potential Downsides
• Ethical concerns: Relying on AI for legal decisions brings up significant ethical questions. Who is responsible if the AI gives incorrect or inappropriate advice? The operator? The software developer? The military institution?
• Lack of nuance: While AI can provide answers based on data and algorithms, it lacks the human elements of intuition and empathy, which are often critical in legal contexts, particularly in matters of military justice.
• Data security: Military law deals with sensitive and classified information. The use of AI opens another avenue for potential data breaches or misuse of information.
• Depersonalization: With the advent of AI, there’s a risk that commanders might rely solely on technology for legal advice, sidelining human expertise and potentially eroding the intricate relationship between legal advisers and military leadership.
My original attempt to explore these benefits and downsides was met with what I can now call a “mixed” response—enthusiasm about the technology, less so about potential implications.
I asked the generative AI program ChatGPT a simple commander-style question: “My subordinate soldier disrespected me; what should I do?” Its response was good, showing the benefits of using AI. It was a quick, smart answer I could have delivered with confidence to any commander while sitting in the airport lounge.
However, the response also showed the downside of using an AI model because it lacked some of the more careful, nuanced legal advice I might deliver if asked to prepare a written legal opinion.
Technological advancements like AI are inevitable, and resisting them entirely is not only futile but could also deprive military lawyers of valuable tools for making their work more efficient and accessible. The military legal community’s task should not be to ban these advancements outright but to thoughtfully integrate them into existing legal frameworks. Advancements in AI have not suspended our ethical and professional responsibilities, and those bulwarks can handle the challenge.
Serving Justice
AI doesn’t have to be a replacement for human expertise; it can be a supplement, just like the Commander’s Legal Handbook or any other resource. While the potential downsides cannot be ignored, they are not insurmountable. JAGs should embrace the opportunity to lead the conversation on how AI can best serve military justice, not stifle it.
By shutting down conversations about the role of AI, JAGs risk staying anchored in the past, not safeguarding the future. Our duty as military legal practitioners is to navigate the present while also preparing for what’s next. And what’s next could be an AI-augmented legal practice that enables military lawyers to better serve those who serve.
So, let’s start the dialogue. Let’s not let fear stifle this professional evolution. Let’s instead approach the issue of AI in military law with open minds and constructive skepticism. That’s the only way we will find the path forward.
* * *
Maj. Trent Kubasiak is a judge advocate with the 10th Mountain Division and Fort Drum, New York. Previously, he was chief of military justice, 10th Mountain Division and Fort Drum. He deployed three times to Afghanistan and once to Kuwait. He has a JD from Marquette University School of Law, Wisconsin; an LLM from the Judge Advocate General’s Legal Center and School, Virginia; and an MBA from Capella University.