
If AI had 23andMe...

by Lauralee Dhabhar, on Feb 13, 2023 11:56:44 AM

ChatGPT meets GOSTCrawl

Genetic testing and ancestry services first became available in 2000. Since that time, over 26 million people have spit and swabbed their way to long-lost relatives and surprising genetic heritages. As humans, we need connections. We want to know that we belong to a greater community, and genetic testing gives us ultimate proof of our connection to the world.

But what does this have to do with artificial intelligence? Well, nothing, really. AIs obviously don’t contain genetic code. But they do contain specific types of building blocks, shaped by particular training methodologies. While two different AI tools may function very differently, they may indeed be “related” on the basis of their fundamental structures and early training methods. Just as it would be exciting to discover Ryan Reynolds was your long-lost cousin, we at Giant Oak (outside of our science and engineering team, who knew all along) were joyously wide-eyed to discover that a fundamental component of GOST claimed a close relation in ChatGPT.

ChatGPT is the viral, state-of-the-art large language model taking the world by storm, but its unassuming cousin is at work deep in the bowels of Giant Oak’s proprietary web crawler, GOSTCrawl. GONER (Giant Oak Named Entity Recognition) is currently the most advanced targeted natural language processing tool on the market, and it is only one of the components that drive GOSTCrawl.

What makes it the most advanced? How is it related to an AI known for being able to write books and answer almost any question on your mind? It all comes down to AI fundamentals.

ChatGPT and GONER are both capable of a significantly complex task: digesting, processing, and interpreting very large blocks of text. Ultimately, they have been trained to use this skill differently. That differentiation comes during the final step in AI development, called fine-tuning, in which creators use several processes, including human feedback, to train a model to apply its skills to a specific task. Before this step, the AI must be built and then pre-trained. These fundamental stages are where ChatGPT and GONER grew up together.

Transformers

Both ChatGPT’s and GONER’s building blocks contain transformers (no, not Autobots or Decepticons). Transformers, introduced in 2017, are neural networks that use a mechanism called self-attention to connect every input to every other input. In other words, self-attention is what allows the AI to understand the context of words and how their arrangement changes meaning, letting it decipher language in a way a human reader would recognize.

Transformers are not an attempt to recreate the human brain but to allow AI to process language in a way that makes sense to humans. It is these building blocks that allow models like ChatGPT and GONER to not only read and categorize human language but to understand it.
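To make the idea concrete, here is a minimal sketch of single-head scaled dot-product self-attention in plain NumPy. It illustrates the mechanism only; it is not GONER’s or ChatGPT’s actual code, and the matrix sizes and random weights are arbitrary:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence of token vectors X."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v   # project inputs to queries/keys/values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # every token scored against every other token
    weights = softmax(scores, axis=-1)    # attention weights sum to 1 per token
    return weights @ V                    # context-aware representation of each token

# Toy example: 4 "tokens" with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (4, 8)
```

Each row of the output is a blend of every token’s value vector, weighted by how relevant the other tokens are to it; that weighting is what “context” means here.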

Training

An AI is only as good as its training. So, how do we train these models to make the best use of their programmed abilities? Here, pre-training is driven by the transformer architecture and an expansion of the self-attention mechanism: both ChatGPT and GONER use multi-headed self-attention, in which the inputs are examined from several different angles at once and transformed to allow for better understanding. During the initial stages, models are fed vast amounts of data. As the model digests each set, it learns the patterns and relationships between words that create language.

This can be equated to learning to read. First, a person is introduced to the alphabet and the sounds of each letter, then to words, and then, over several years, we learn to apply a set of ambiguous rules to create meaning out of those words. Fortunately, in the case of AI, the difference between there, their, and they’re is grasped far faster.

[Figure: multi-headed self-attention]
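The “multi-headed” part simply runs several such attention computations in parallel, each head with its own projection matrices, and concatenates the results. A minimal sketch, reusing the self_attention function from the example above (again an illustration, not either model’s real implementation):

```python
import numpy as np

def multi_head_attention(X, heads):
    """Each head has its own (W_q, W_k, W_v) projections, so each examines
    the input from a different 'angle'; the outputs are concatenated."""
    return np.concatenate(
        [self_attention(X, W_q, W_k, W_v) for (W_q, W_k, W_v) in heads],
        axis=-1,
    )

# Two 4-dimensional heads over 8-dimensional token embeddings
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 8))
heads = [tuple(rng.normal(size=(8, 4)) for _ in range(3)) for _ in range(2)]
print(multi_head_attention(X, heads).shape)  # (4, 8): two 4-dim heads concatenated
```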

At this point, ChatGPT and GONER look very similar: both are spectacular linguists without jobs. To put these models to work, they must be fine-tuned. Fine-tuning is further training that allows a model to apply its skills to a specific task, and it is here that ChatGPT and GONER start to look very different. If you have tried ChatGPT, you know it is designed to generate answers to questions entered into its chat box; a user interacts with it directly. GONER, however, is far more modest in its approach to helping users. Its job is to read and quickly understand vast amounts of information and direct GOSTCrawl to what matters most for its users' requirements. Through a series of methodologies, including human feedback, each model is taught to do its job with high accuracy and minimal bias.
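For a concrete, simplified picture of what fine-tuning a pre-trained transformer for named entity recognition can look like, here is a sketch using the open-source Hugging Face transformers library. The base model, tag set, and example sentence are illustrative assumptions, not a description of GONER’s internals:

```python
# Hedged sketch: attach a fresh token-classification (NER) head to a
# pre-trained transformer. The label set and model name are hypothetical.
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG"]  # illustrative tag set
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)

# Forward pass: one predicted tag per sub-word token. Actual fine-tuning
# would train this model on labeled examples before using it.
inputs = tokenizer("Ryan Reynolds joined Giant Oak.", return_tensors="pt")
predictions = model(**inputs).logits.argmax(dim=-1)
```

The pre-trained weights already encode the language patterns learned during pre-training; fine-tuning only has to teach the new classification head (and gently adjust the rest of the model) to map those patterns onto entity tags.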

GOST users will never see GONER. They will never ask it questions. But GONER will always be sitting in the background, making sure you are served the information you need. It is ChatGPT’s humble cousin.

Learn more about our latest innovations
