GitHub Copilot for Business is Twice the Price
Naughty or nice, Microsoft says Enterprise should pay twice the price.
Hey Guys,
GitHub has launched Copilot for Business. The pitch? Simple license management, organization-wide policy controls, and industry-leading privacy—all for $19 USD per user per month.
GitHub Copilot, the company's service that intelligently suggests lines of code, is now available in a plan for enterprises, months after launching for individual users and educators.
Microsoft is being so entirely generous in its AI-as-a-Service subscriptions. Called GitHub Copilot for Business, the new plan, which costs $19 per user per month, comes with all the features in the single-license Copilot tier along with corporate licensing and policy controls. That includes a toggle that lets IT admins prevent suggested code that matches public code on GitHub from being shown to developers, a likely response to the intellectual property controversies brewing around Copilot.
"You can easily set policy controls to enforce user settings for public code matching on behalf of your organization," explains Shuyin Zhao, senior director of product management, in a blog post.
Microsoft is certainly fond of subscription payments for its enterprise products. When will they start charging for Bing ChatGPT access anyway? Disrupt Google for just $6.99 a month, guys!
So why the extra $9 for corporate users? The public code filter is already available to individual users, who pay $10 per month for Copilot's AI help. For corporate accounts, however, control over that filter belongs to the IT administrator. That's a nice administrative add-on there, Microsoft!
GitHub has turned into such a benevolent tool after all:
Microsoft, GitHub, and OpenAI are being sued for allegedly violating copyright law by reproducing open-source code using AI. But the suit could have a huge impact on the wider world of artificial intelligence.
No mercy no malice in the era of monopoly capitalism and btw, anything goes!
Available as a downloadable extension for development environments, including Microsoft Visual Studio, Neovim and JetBrains, Copilot is powered by an AI model called Codex, developed by OpenAI, that’s trained on billions of lines of public code to suggest additional lines of code and functions given the context of existing code. Copilot — which had over 400,000 subscribers as of August — can surface a programming approach or solution in response to a description of what a developer wants to accomplish (e.g., “Say hello world”), drawing on its knowledge base and the current context.
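For anyone who hasn't tried it, the workflow looks roughly like this: you type a comment describing what you want, and Copilot proposes the code beneath it. The snippet below is a hand-written illustration of that interaction, not actual Copilot output.

```python
# Hand-written illustration of the Copilot workflow, not real Copilot output.
# The developer types the comment prompt; an assistant like Copilot would then
# suggest the function body that follows.

# Say hello world
def say_hello_world():
    print("Hello, world!")

say_hello_world()  # prints: Hello, world!
```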
GitHub Copilot, introduced in 2021 as a Visual Studio Code editor extension, offers coding suggestions and functions directly from the user’s programming editor or IDE. The AI model behind Copilot is trained on open source code in public repositories.
Microsoft Claims it's Above the Law
GitHub claims that fair use — the doctrine in U.S. law that permits the use of copyrighted material without first having to obtain permission from the rights holder — protects it in the event that Copilot was knowingly or unknowingly trained on copyrighted code.
Microsoft, its subsidiary GitHub, and its business partner OpenAI have been targeted in a proposed class action lawsuit alleging that the companies’ creation of AI-powered coding assistant GitHub Copilot relies on “software piracy on an unprecedented scale.” Microsoft denies that this is the case.
The Free Software Foundation, a nonprofit that advocates for the free software movement, has called Copilot “unacceptable and unjust.” And Microsoft, GitHub and OpenAI are being sued in a class action lawsuit that accuses them of violating copyright law by allowing Copilot to regurgitate sections of licensed code without providing credit. But that hasn't stopped Microsoft from commercializing it as quickly as possible!
Welcome to the ChatGPT era: a corrupt OpenAI, Microsoft, and GitHub worshipping as a trinity at the altar of A.I. Any questions?
An Added $9 for Privacy and Admin Control
Copilot for Business comes with a commitment that GitHub "won’t retain code snippets, store or share your code regardless if the data is from public repositories, private repositories, non-GitHub repositories, or local files."
That’s so kind of you, GitHub. Was that an “A.I. for Good” decision too?
TechCrunch goes on: “GitHub’s attempt at rectifying this is a filter, first introduced to the Copilot platform in June, that checks code suggestions with their surrounding code of about 150 characters against public GitHub code and hides suggestions if there’s a match or ‘near match.’ But it’s an imperfect measure.”
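To make that concrete, here is a deliberately naive sketch of what a "near match" check against a corpus of public code could look like. GitHub has not published how its filter actually works; the ~150-character window is the only reported detail, and everything else below (the similarity measure, the threshold, the function name) is an assumption for illustration only.

```python
from difflib import SequenceMatcher

# Illustrative only: a naive "near match" check, NOT GitHub's implementation.
# GitHub reportedly compares a suggestion plus ~150 characters of surrounding
# code against public GitHub code; the rest of this sketch is assumed.

def is_near_match(suggestion, surrounding, public_snippets, threshold=0.9):
    """Return True if the suggestion, taken with its surrounding context,
    closely resembles any snippet from a corpus of public code."""
    window = (surrounding + suggestion)[-150:]  # last ~150 characters of context
    for snippet in public_snippets:
        if SequenceMatcher(None, window, snippet).ratio() >= threshold:
            return True
    return False

# Usage: a client would hide the suggestion when the filter flags a near match.
public_corpus = ["def quicksort(arr): ..."]
if is_near_match("def quicksort(arr): ...", "# sort helper\n", public_corpus):
    print("suggestion hidden")
```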
So in theory, business customers can rest assured that their super-secret, money-minting algorithm won't get sent to GitHub for product improvement.
Your Data is my Data
Copilot for Business, however, does transmit "engagement data": events related to editing actions (e.g. completions accepted or dismissed), errors, and data like latency and feature use, including potentially personal data like pseudonymous identifiers.
Microsoft has found sneaky ways to protect itself legally, it would seem: it isn't clear whether Copilot for Business' promise to discard the code suggestions it generates will actually deprive Microsoft of data that could be used to improve future output. Not a big surprise.
In November, lawyer and developer Matthew Butterick announced a lawsuit challenging Copilot, described as "an AI product that relies on unprecedented open-source software piracy." It wasn’t greatly covered in the mainstream media. I noticed the story was also absent from LinkedIn, which is owned by Microsoft.
But GitHub, aware that its enterprise customers might be put off by uncertain legal risk, has a standing offer in its GitHub Copilot Product Specific Terms to defend corporate clients against infringement claims based on Copilot output. How very generous of them.
Suffice it to say that A.I. for code is going to get messy in the years to come.
GitHub plans to introduce additional features in 2023 aimed at helping developers make informed decisions about whether to use Copilot’s suggestions, including the ability to identify strings matching public code with a reference to those repositories.
Individual Copilot users, and Copilot for Business customers not covered under enterprise accounts, will have to face any legal action on their own – if it comes to that. So that extra $9 is also paying for your immunity. Peculiar and sneaky, GitHub!
Microsoft loves to brag, and GitHub does a great job of it. GitHub says Copilot has helped redefine productivity for more than a million developers, boasting that it synthesizes as much as 40% of their code. According to GitHub's own research, Copilot has helped users code 55% faster, the company said.
Scraped by Microsoft - but definitely A.I. for Good
Copilot, which was unveiled by Microsoft-owned GitHub in June 2021, is trained on public repositories of code scraped from the web, many of which are published with licenses that require anyone reusing the code to credit its creators.
So it turns out those $9 extra are well spent, IT managers!
For code safety, GitHub will not retain code snippets or store or share users’ code, regardless of whether the data is from public repositories, private repositories, or local files. This, despite the tool itself being trained on billions of lines of publicly available code.
The Business tier also gives administrators the ability to select which organizations, teams, and developers receive licenses, along with policy controls that let them enforce user settings for public code matching on behalf of an organization.
Privacy, legal protection, policy controls! Microsoft knows its customers.
Thanks for reading!