LLM Collection
Last updated on January 18, 2024 · 1 min read
The table below lists some of the most influential large language models, ordered from newest to oldest by release date.
Model | Release Date | Developer | License | Description |
---|---|---|---|---|
GPT-4 | March 2023 | OpenAI | Custom | Multimodal successor to GPT-3, accepting both image and text inputs and producing text outputs. |
GPT-3 | June 2020 | OpenAI | Custom | 175 billion parameters, known for its versatility and strong few-shot performance across tasks. |
Turing-NLG | February 2020 | Microsoft | Custom | 17 billion parameters, aimed at natural language understanding and generation. |
GPT-2 | February 2019 | OpenAI | Modified MIT | 1.5 billion parameters; initially withheld from full public release due to concerns over potential misuse. |
Transformer-XL | January 2019 | Google/CMU | Apache 2.0 | Extended the Transformer with segment-level recurrence to handle longer sequences of text. |
BERT | October 2018 | Google | Apache 2.0 | Bidirectional encoder designed to understand the context of words in search queries and other text. |
GPT | June 2018 | OpenAI | Modified MIT | First Generative Pre-trained Transformer, with 117 million parameters. |
ELMo | March 2018 | Allen Institute for AI | Apache 2.0 | Deep contextualized word representations that capture rich, context-dependent word meanings. |
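
The openly licensed models in the table, such as GPT-2 and BERT, can be downloaded and run locally. The snippet below is a minimal sketch, assuming the Hugging Face `transformers` library is installed; the `gpt2` and `bert-base-uncased` identifiers refer to the public checkpoints on the Hugging Face Hub rather than anything listed in the table itself.

```python
# Minimal sketch: loading two of the openly licensed models above
# through the Hugging Face `transformers` pipeline API (assumed installed).
from transformers import pipeline

# GPT-2 (Modified MIT): autoregressive text generation.
generator = pipeline("text-generation", model="gpt2")
print(generator("The Transformer architecture", max_new_tokens=20)[0]["generated_text"])

# BERT (Apache 2.0): masked-word prediction using bidirectional context.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for prediction in fill_mask("Large language models are trained on [MASK] amounts of text."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The proprietary entries are not downloadable in this way; GPT-3 and GPT-4, for example, are accessed through OpenAI's hosted API.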