
ChatGPT: AI Responses to Common EB Questions


  • 2 weeks later...

Brian, I'm impressed and not impressed. Basically, all I've seen so far from ChatGPT is intelligent cutting and pasting of what is out on the internet. Granted, it's a real achievement for it to figure out what information is relevant to the question, scrape it from the internet, and assemble it into an intelligible answer, but this is newsletter-type stuff, not an actual solution to a hard problem.

I read the NY Times article on ChatGPT a week ago, and the example that really struck me was the algebra question. The article linked to a post on Twitter: "A line parallel to y = 4x + 6 passes through (5, 10). What is the y-coordinate of the point where this line crosses the y-axis?" ChatGPT begins by explaining the problem and that we need to find a parallel line and do some algebra, about as well as a middle school math teacher at the chalkboard, but then boldly spits out a humorously wrong answer.
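For reference, the correct working of that quoted problem takes only a few lines (a quick arithmetic sketch of the textbook solution, not a reproduction of ChatGPT's output):

```python
# A line parallel to y = 4x + 6 must share its slope of 4.
slope = 4

# It passes through (5, 10), so solve 10 = 4 * 5 + b for the intercept b.
x, y = 5, 10
b = y - slope * x

print(b)  # -10: the line crosses the y-axis at (0, -10)
```

So the answer the model fumbled is simply y = -10.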

ChatGPT is just regurgitating, cleverly to be sure, the stuff it scrapes off the internet.

Try asking it one of the more difficult questions that you have received on BenefitsLink over the last year and see what you get.

Luke Bailey

Senior Counsel

Clark Hill PLC

214-651-4572 (O) | LBailey@clarkhill.com

2600 Dallas Parkway Suite 600

Frisco, TX 75034


Yeah, my point was that a week or two in, it's already pretty good at providing decent answers to basic EB questions.  Imagine a year or two in?  Or a decade?

I would guess that clients will be more interested in what the bot has to say than in our input over that kind of horizon.  Or at least we'll constantly be double-checked and confronted with any differences in the AI's analysis.


2 hours ago, Brian Gilmore said:

Imagine a year or two in?  Or a decade?

That's what they were saying about self-driving cars a decade ago, Brian. And that is a much lighter lift.

2 hours ago, Brian Gilmore said:

I would guess that clients will be more interested in what the bot has to say than our input over that kind of horizon.

I'd be willing to bet on that, Brian.


2 hours ago, Brian Gilmore said:

Or at least we'll constantly be double-checked and confronted by any differences in the AI analysis.

That will double the work.

My son has done AI research for one of the major software companies and is currently finishing law school. He tested ChatGPT for legal research and thought it would be helpful to folks fresh out of law school, as long as they checked each answer and used ChatGPT only as a starting point (which I agree with). As I told him, if someone had told me 20 years ago that there would be something out there like ChatGPT (or Google, Microsoft, or Apple translate) that could so flawlessly mimic the workings of a mediocre human mind, I would not have believed it. I would have thought that language was too complicated. On the other hand, 20 years ago I fully expected that by now we would have AI that could diagnose illnesses like House. To my chagrin, both of those predictions were wrong.

Luke Bailey

Senior Counsel

Clark Hill PLC

214-651-4572 (O) | LBailey@clarkhill.com

2600 Dallas Parkway Suite 600

Frisco, TX 75034


  • 2 weeks later...

It's important to note that GPT is a machine learning model, which means it uses statistical techniques to learn from the data it is trained on. As a result, the quality and characteristics of the training data can have a significant impact on the model's performance.

I hope we will see better results in the future.


On 12/20/2022 at 10:14 AM, Luke Bailey said:

I would guess that clients will be more interested in what the bot has to say than our input over that kind of horizon.

I am reminded of my conviction that we got section 409A as a consequence of claims by “consultants” that our advice and interpretation of the nonqualified deferred compensation rules were too conservative.

The quoting function is illustratively mechanical in attributing Brian Gilmore’s statement to Luke Bailey.

