PhoGPT: Generative Pre-training for Vietnamese (2023)
An autoregressive language model like ChatGPT.
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
Inspired by Andrej Karpathy’s "Let’s Build GPT", this project guides you step‑by‑step to build a GPT from scratch, demystifying its architecture through clear, hands‑on code.
A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support Transformer-model education and to reverse-engineer GPT models from scratch.
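tiktoken maps text to integer token ids and back. A minimal character-level sketch of that same encode/decode contract, for illustration only (the class and names here are hypothetical, not tiktoken's actual BPE implementation):

```python
# Hypothetical character-level tokenizer illustrating the encode/decode
# round-trip that BPE tokenizers like tiktoken provide. Real BPE merges
# multi-character sequences into single ids; this maps one character per id.
class CharTokenizer:
    def __init__(self, text):
        vocab = sorted(set(text))                      # unique characters, stable order
        self.stoi = {ch: i for i, ch in enumerate(vocab)}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, s):
        return [self.stoi[ch] for ch in s]             # text -> list of ids

    def decode(self, ids):
        return "".join(self.itos[i] for i in ids)      # ids -> text

tok = CharTokenizer("hello world")
ids = tok.encode("hello")
assert tok.decode(ids) == "hello"                      # round-trip holds
```

The key property, shared with real tokenizers, is that `decode(encode(s))` returns the original string for any text covered by the vocabulary.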
A simple GPT app that uses the falcon-7b-instruct model with a Flask front-end.
A pure Rust GPT implementation from scratch.
An implementation of a GPT-2 variant.
Unlock the power of your PDFs with HackWES-PDFChatGenius! This innovative project, developed during the HackWES hackathon, transforms the way you interact with PDF documents. Simply upload your PDF, and engage in a natural conversation to extract information, ask questions, and gain insights from your documents.
This is an NLP coursework repository for the Honours Bachelor of Artificial Intelligence program at Durham College. It contains weekly labs, assignments, and the final project completed during the Winter 2024 term.
An industrial project on NLP applications in finance.
ToyGPT, inspired by Andrej Karpathy's GPT from scratch, builds a toy generative pre-trained transformer at its most basic level, using a simple bigram language model with attention to teach the basics of creating an LLM from scratch.
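A bigram language model predicts each token from only the token before it. A minimal count-based sketch of the idea (hypothetical example, not ToyGPT's code; a trained neural bigram model learns these statistics as weights instead of counting them):

```python
from collections import defaultdict

# Count-based bigram language model sketch: record how often each token
# follows each other token, then predict the most frequent successor.
def train_bigram(tokens):
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(tokens, tokens[1:]):   # every adjacent pair in the data
        counts[a][b] += 1
    return counts

def most_likely_next(counts, token):
    following = counts[token]
    return max(following, key=following.get)  # argmax over successor counts

tokens = list("abababac")
counts = train_bigram(tokens)
# 'a' is followed by 'b' three times and 'c' once, so 'b' is most likely.
print(most_likely_next(counts, "a"))  # → b
```

Sampling from these successor distributions token by token generates text; adding attention, as ToyGPT's description suggests, lets the prediction condition on more than the single preceding token.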
A Generatively Pretrained Transformer that generates Shakespeare-esque quotes.
An academic implementation of GPT: only math and raw JAX
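The core math such from-scratch implementations center on is scaled dot-product attention, softmax(QKᵀ/√d)V. A NumPy sketch of that single operation (illustrative, not code from the repository above):

```python
import numpy as np

# Scaled dot-product attention sketch: softmax(Q K^T / sqrt(d)) V.
# Each output row is a weighted mix of the value rows, with weights
# given by softmax-normalized query-key similarities.
def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                          # pairwise similarities
    scores = scores - scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8): one output row per query
```

A full GPT adds learned projections for Q, K, and V, a causal mask so each position attends only to earlier ones, and multiple heads, but this equation is the centerpiece.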
PyTorch implementation of GPT from scratch
Repository for all things Natural Language Processing
(GPT-1) | Generative Pre-trained Transformer - 1
I built a GPT model from scratch to generate text
Repository for personal experiments