
Let's set aside the term 'rudimentary.' We've been using advanced AI for many years. Think about it: when did Apple start enhancing our photos? How long has Google been mining data to improve search? YouTube has been using machine learning to recommend videos since around 2008. And Netflix? They've applied machine learning to improve the customer experience since the days when they were mailing out DVDs.

The future is indeed unfolding, but it didn't start with ChatGPT in late 2022. It's an ongoing journey, continuously moving forward. As you mentioned, there's no stopping it. We can, however, learn to leverage it to shore up our weaknesses in writing and publishing.


Yup :) You got me. We've been using machine learning for decades.

I didn't say this in the article because it felt out of scope, but I would argue that every computer program is technically a form of AI. When programmers write code, they are translating their intelligence into a written form the computer can follow. The computer is running encoded human intelligence.

From that perspective, I honestly don't know where 'rudimentary' starts or ends. Machine learning is definitely more advanced than me writing the rules by hand, but is it AI? It is itself code written by humans, so … maybe?
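To make that question concrete, here's a toy sketch (pure Python; every name, keyword, and number in it is made up purely for illustration). In the first function, the programmer writes the whole rule by hand. In the second, the programmer still writes the shape of the rule, but the number inside it is derived from example data, which is roughly the move machine learning makes.

```python
# Hypothetical illustration: hand-coded rule vs. a rule whose parameter
# is "learned" from examples. Nothing here is a real spam filter.

# 1. Hand-written rule: the programmer encodes their own judgment directly.
def is_spam_by_rule(subject: str) -> bool:
    # The keyword itself is human intelligence, written down as code.
    return "free money" in subject.lower()

# 2. Learned rule: the program derives its parameter from examples instead.
def fit_threshold(examples: list[tuple[int, bool]]) -> float:
    """Pick a subject-length threshold from (length, is_spam) examples.

    Returns the midpoint between the two class means: a toy 'training' step.
    """
    spam_lengths = [n for n, spam in examples if spam]
    ham_lengths = [n for n, spam in examples if not spam]
    return (sum(spam_lengths) / len(spam_lengths)
            + sum(ham_lengths) / len(ham_lengths)) / 2

data = [(80, True), (95, True), (20, False), (30, False)]  # made-up examples
threshold = fit_threshold(data)

def is_spam_by_length(subject: str) -> bool:
    # The rule's *shape* is still human-written; only the number was learned.
    return len(subject) > threshold

print(is_spam_by_rule("FREE MONEY inside!"))  # True
print(is_spam_by_length("x" * 100))           # True (100 > learned threshold)
```

Either way, a human wrote the program; the second one just delegates picking one number to the data. Scale that delegation up by a few billion parameters and you get the systems we now call AI, which is why the boundary is so hard to draw.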

Like you said, it's an ongoing journey, not a definitive "now we have arrived at advanced AI" moment.

It'll be interesting to see how it advances, and when people say "this is actually AI now."


Agreed!

Thanks for the follow up!
