As AI continues to develop and adoption becomes more widespread, the temptation to offload tasks we don't fully understand will get bigger and bigger.

I think being able to reach for AI's assistance to complete tasks is awesome, but we should make a point of avoiding blind submission and get into the habit of asking our artificial friends "why?"

AI has the potential to accelerate learning like never before. But we have to ask why.

Why isn't this working?

Don't just ask Mr. Robot to do something for you without attempting it yourself. Try, fail, and then ask why.

Then try letting it come up with its own solution, and ask why. Why does that work? What are your sources? Break the solution apart. How exactly does this function work?

Make AI your rubber duck. Explain your understanding to the robot and ask it if you're on the right track. But don't just take what it says as irrefutable truth. Check its sources. Make sure it's not summarizing a newspaper article from 1997 that is no longer applicable.

Since I can be pretty dumb sometimes, my favorite thing to ask AI is to explain things to me like I'm five years old. And I've learned a lot by doing that (and checking its sources)!

Take advantage of artificial intelligence, but don't let your intelligence become artificial.