
The NLP Year in Review | Tech On Trial

    • Technology

2022 has been a fun year in AI and NLP, and today we thought we'd take a minute to reflect on everything that has happened, especially some of the important developments in the generative side of NLP in 2022.

GPT has been a big deal in NLP. GPT, the Generative Pre-trained Transformer, is a generation technology that helps us generate content, usually text, and that content can be all kinds of things. If you look around on YouTube and dabble with OpenAI's GPT-3, you'll see examples of everything from a simple prompt like "generate a tweet for me" on up, which is kind of interesting.
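If you want to try that yourself, here is a minimal sketch of a GPT-3 completion call using the OpenAI Python client as it looked in 2022. The model name, prompt, and settings are illustrative assumptions on our part, not details from the episode.

```python
# Minimal GPT-3 text-generation sketch with the 2022-era "openai" package.
# Model name, prompt, and parameters below are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder: use your own key

response = openai.Completion.create(
    model="text-davinci-003",  # a GPT-3 completion model available in late 2022
    prompt="Write a short tweet announcing our NLP year-in-review episode.",
    max_tokens=60,     # keep the output roughly tweet-sized
    temperature=0.7,   # allow some creativity while staying on topic
)

print(response["choices"][0]["text"].strip())
```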

For a machine to generate text, do it well, and stay within the context of what you want is actually really cool and really useful. This year there were real advances in GPT-3: they added support for editing text and inserting data, including human-written content, back into the generated output. So if you ask GPT to generate something specific for you, you can now embed new information into it to make it slightly better, and that is a big deal.
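As a rough sketch of what those editing and insertion features looked like in the 2022 API (the prompts, suffixes, and instructions below are hypothetical examples of ours; only the general call shapes reflect the real endpoints):

```python
# Sketch of GPT-3 "insert" and "edit" usage with the legacy openai package.
# All text inputs here are hypothetical; only the call shapes are real.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Insertion: the model fills in text between a prompt and a suffix.
inserted = openai.Completion.create(
    model="text-davinci-002",
    prompt="In 2022, the biggest advances in NLP were ",
    suffix=" and those trends should continue into 2023.",
    max_tokens=64,
)
print(inserted["choices"][0]["text"])

# Editing: the model rewrites existing text according to an instruction.
edited = openai.Edit.create(
    model="text-davinci-edit-001",
    input="GPT three is a generativ pretraind transformer.",
    instruction="Fix the spelling and write out the model name properly.",
)
print(edited["choices"][0]["text"])
```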

Another thing they've done, after noticing this was becoming a problem, is add a lot of plagiarism detection to the generation process. They've done that with slightly better word parsing and quite a bit of synonym generation, so that the content is as unique as possible and avoids plagiarism.
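Just to make the synonym-generation idea concrete, here is a toy illustration using NLTK's WordNet. To be clear, this is not OpenAI's actual implementation; it only sketches how swapping words for synonyms can make generated text more distinct.

```python
# Toy synonym-substitution sketch using NLTK's WordNet.
# This illustrates the general idea only; it is NOT OpenAI's implementation.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)   # one-time corpus download
nltk.download("omw-1.4", quiet=True)   # required by some NLTK versions

def first_synonym(word: str) -> str:
    """Return a WordNet synonym for `word`, or the word itself if none is found."""
    for synset in wn.synsets(word):
        for lemma in synset.lemmas():
            candidate = lemma.name().replace("_", " ")
            if candidate.lower() != word.lower():
                return candidate
    return word

sentence = "generate unique content quickly"
rewritten = " ".join(first_synonym(w) for w in sentence.split())
print(rewritten)  # prints a roughly paraphrased version of the sentence
```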

Another really big thing this year was allowing for the monitoring, or ongoing review, of data repositories, which lets GPT constantly scan and add new content as it becomes available and makes GPT's engine more relevant to your searches as well.

Tune in next week, when we'll talk a little about our roadmap for 2023 and what we think could happen in NLP next year.

Happy 2023 everyone!
