New AI Tool Writes Its Own Code

OpenAI's new tool, Copilot, can analyze written code and generate new, matching code.
Ameya Paleja
Image: programming code (Pexels / Markus Spiske)

GitHub and OpenAI have released a preview of their new AI tool, GitHub Copilot. In a blog post, GitHub CEO Nat Friedman called the tool a pair programmer that can draw context from written code and suggest new functions, helping programmers find new ways to address their problems and complete their work faster.

Conventionally, pair programming involves two people working on the same code to catch mistakes early and speed up development. With this release, GitHub wants one of those programmers to be its AI tool. This is the first major rollout from OpenAI since Microsoft invested $1 billion in the organization, which has since shifted from a non-profit structure to a "capped-profit" model. Microsoft also owns GitHub, a software code repository hugely popular with programmers.

Copilot is built on OpenAI Codex, a descendant of GPT-3 (Generative Pre-trained Transformer 3), OpenAI's language-generating algorithm. GPT-3 made global news when it demonstrated near-human writing skills in 2020, thanks to the enormous number of parameters it uses to make connections between words, phrases, and sentences.

For its part, OpenAI Codex was trained on the terabytes of code available on GitHub along with English-language text, and can now suggest code with ease. A commercial version of Copilot is expected to launch in the next few months, just as GPT-3 became available for commercial licensing last year. Copilot is compatible with many languages but works best with Python, JavaScript, TypeScript, Ruby, and Go, according to Friedman.
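To illustrate the kind of workflow described above: a programmer writes only a function signature and a descriptive docstring, and a Copilot-style tool proposes the body. The sketch below is an invented example of such a completion, written by hand for illustration, not actual Copilot output.

```python
# The programmer types the signature and docstring; a tool like Copilot
# would suggest a plausible body from that context. The body shown here
# is a hand-written stand-in for such a suggestion.

def parse_expenses(expenses_string):
    """Parse a string of expenses, one per line in the form
    'DATE CATEGORY AMOUNT', into a list of (date, category, amount) tuples."""
    expenses = []
    for line in expenses_string.splitlines():
        parts = line.split()
        if len(parts) == 3:  # skip blank or malformed lines
            date, category, amount = parts
            expenses.append((date, category, float(amount)))
    return expenses
```

The programmer remains in the loop: suggestions like this can be accepted, edited, or rejected, which is why GitHub frames the tool as a pair programmer rather than an autonomous coder.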

As with all machine-learning systems, Copilot will inherit the biases of the code it was trained on, which will be reflected in the output it generates. "GitHub Copilot may sometimes produce undesired outputs, including biased, discriminatory, abusive, or offensive outputs," its website says under Responsible AI. Similar biases were observed in GPT-3's outputs. GitHub says it has included filters to block offensive words and avoid synthesizing suggestions in sensitive contexts. Real-world use will show how effective these filters are.

Users interested in trying out Copilot can join the waitlist on GitHub's website.
