🦞🌯 Lobster Roll

Extrapolating to Unnatural Language Processing with GPT-3’s In-context Learning: The Good, the Bad, and the Mysterious (ai.stanford.edu)

Stories related to "Extrapolating to Unnatural Language Processing with GPT-3’s In-context Learning: The Good, the Bad, and the Mysterious" across the full archive.

Extrapolating to Unnatural Language Processing with GPT-3’s In-context Learning: The Good, the Bad, and the Mysterious (ai.stanford.edu)
Natural Language Processing for the Working Programmer (nlpwp.org)
GODISNOWHERE: A look at a famous question using Python, Google and natural language processing (ileriseviye.wordpress.com)
FAUST: A language for real-time sound processing and synthesis (faust.grame.fr)
An Open Source Wit.ai Alternative for Natural Language Processing (source.id.hn)
LexVec: State-of-the-art natural language processing (github.com)
How To Get Into Natural Language Processing (blog.ycombinator.com)
Natural Language Processing with Small Feed-Forward Networks (arxiv.org)
Natural Language Processing with Python updated for Python 3 and NLTK 3 (nltk.org)
A Guide to Natural Language Processing (tomassetti.me)
Speech and Language Processing, 3rd Edition (web.stanford.edu)
Minimalist movie posters generated using the Processing programming language (mad4j.github.io)
Natural Language Processing is Fun (medium.com)
Hardware Acceleration for Unstructured Big Data and Natural Language Processing (deepblue.lib.umich.edu)
Abstract: "In this thesis, we present a set of hardware accelerators for unstructured big-data processing and natural language processing. The first accelerator, called HAWK, targets unstructured log processing and fits within the context of string search. HAWK consumes data at a fixed 32 GB/s, ..."
Building a Scalable and Language Agnostic Data Processing Pipeline (turbolent.com)
Natural Language Processing: the age of Transformers (blog.scaleway.com)
NLP via BERT
Dex: Research language for array processing in the Haskell/ML family (github.com)
OpenAI releases the largest version (1.5B parameters) of their GPT-2 language model, along with code and model weights (openai.com)
GPT-3: Language models are few-shot learners (arxiv.org)
Quantum Natural Language Processing (medium.com)
Apologies for linking Medium; I could not find an alternative source that was as high-level and as comprehensive as this post. I'd be happy to update the URL if you have one.
The Ultimate Guide to OpenAI's GPT-3 Language Model (twilio.com)
The Illustrated GPT-2 (Visualizing Transformer Language Models) (jalammar.github.io)
Text completion using the GPT-2 language model (bellard.org)
Speech and Language Processing (web.stanford.edu)
Creating a GPT-Style Language Model for a Single Question (unite.ai)
Natural Language Processing for Icelandic with PyPy: A Case Study (pypy.org)
Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models (cerebras.net)
OpenAI's GPT-3 Language Model: A Technical Overview (lambdalabs.com)
Think of language models like ChatGPT as a “calculator for words” (simonwillison.net)
ChatGPT is cutting non-English languages out of the AI revolution (wired.com)