Does Voice Prompting Code Generation Have a Future?
GitHub Copilot feature lets developers code with their voice
What could be a secret power of Generative AI? How about coding with voice? It could be a ‘Hey, GitHub!’ moment, or at least Microsoft hopes so.
Copilot officially launched for everyone back in June, costing $10 per month or $100 per year after a free 60-day trial. By 2024, I think the code-generation leaders will have a useful voice dictation feature. I see this as having a legit place in the tools software developers might use in the near future.
This is because Generative AI is moving fast now. If GitHub Copilot is doing it, others are too, or won’t be far behind, which will push the opportunities here along even faster.
Think about it: how weird would voice prompting be? “Hey, GitHub!” will allow programmers to code with just their voice and no keyboard, just like speaking to Siri, Alexa, or Google Assistant.
Let’s see how this might be useful:
Hey GitHub, you might actually have a future with voice.
Could Voice be more efficient in some instances?
Voice search is slowly becoming more popular, so how could GitHub Copilot bring it more into the mainstream?
GitHub is offering access to the new voice feature via a waitlist that’s open to interested developers now; essentially, it will allow developers to activate Copilot’s ears via the “Hey, GitHub” wake word. Obviously, this is just the beginning for Generative AI at the intersection of voice and coding.
Copilot suggests lines of code to developers inside their code editor, and it’s capable of suggesting the next line of code as developers type in an integrated development environment (IDE) like Visual Studio Code, Neovim, and JetBrains IDEs. While Microsoft is pushing it, it actually has many really good competitors, including Tabnine, GPT-Code-Clippy, Amazon CodeWhisperer, and others.
The ideas of autocomplete, auto-suggest, and voice coding are not new, but in my opinion they are reaching the point where they could be genuinely useful and part of the toolkit of new software developers, especially over the early 2020s as they are refined.
It’s also a win for Microsoft in terms of inclusion and accessibility. According to GitHub, Hey, GitHub! can recognize “natural language,” making Copilot a more user-friendly tool among programmers. This also boosts Copilot’s accessibility for developers who find the conventional process of delivering code inputs difficult.
Other developers might just want voice integration for easy navigation. So even if a developer doesn’t want any code suggestions, voice can serve other practical use cases, such as navigating a codebase by saying something like “Hey GitHub, go to line 34,” or controlling the IDE by toggling zen mode.
I like how Microsoft is pushing the envelope here with great ideas in coding, even if some of them don’t stick. I also noticed that the reviews for Tabnine were very decent. I think coding-focused Generative AI startups that aren’t yet out of stealth, or have yet to be founded, will also mature.
For instance, I wouldn’t be surprised to see Jasper.AI also offer utility with regard to code, as it has already expanded into text-to-image. It’s a very well-funded startup: when you can raise $125 million, you are probably on to something.
What if you are a remote worker without access to a mouse and keyboard: could you actually work? While this is still an early-stage experiment developed by an R&D team called GitHub Next (TechCrunch), it could have significant ramifications from an accessibility perspective, as it reduces the amount of interaction required with a mouse and keyboard.
All of this is somewhat speculative and futuristic; it’s hard to believe that even a beta of Copilot voice is actually here. But interest in Generative AI at the intersection of code is going to be very high. Anything that makes engineers and software developers more productive will have huge implications and will attract funding rather well.
If you are interested in trying out Tabnine, you can start a free trial here.
As for ‘Hey, GitHub!’ - what might the average developer actually use this for? I think the answer is speed. Just as some of us have gotten used to bossing around our A.I.s with voice commands at home, the same could hold true at work.
Aside from writing and editing code, Hey, GitHub! will allow programmers to navigate code simply by indicating the line they want to view (e.g., “Hey, GitHub! go to line 34,” “Hey, GitHub! go to method X,” or “Hey, GitHub! go to next block”). It also accepts other Visual Studio Code commands, such as “toggle zen mode” and “run the program.” Additionally, experiment participants will have quick access to code summarization in Hey, GitHub!, giving them summary explanations of certain code functions.
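To make the idea concrete, here is a minimal, hypothetical sketch of how transcribed phrases like the ones above might be routed to editor actions. GitHub Next’s actual implementation is not public, so the command names and fallback behavior here are assumptions purely for illustration.

```python
import re

# Hypothetical phrase-to-action patterns, modeled on the commands the
# article mentions ("go to line 34", "toggle zen mode", etc.).
COMMANDS = [
    (re.compile(r"go to line (\d+)"), lambda m: ("gotoLine", int(m.group(1)))),
    (re.compile(r"go to method (\w+)"), lambda m: ("gotoSymbol", m.group(1))),
    (re.compile(r"toggle zen mode"), lambda m: ("toggleZenMode", None)),
    (re.compile(r"run the program"), lambda m: ("runProgram", None)),
]

def parse_command(transcript: str):
    """Match a transcribed phrase against the known command patterns."""
    text = transcript.lower().strip()
    for pattern, action in COMMANDS:
        m = pattern.search(text)
        if m:
            return action(m)
    # Assumed fallback: unrecognized speech is treated as a natural-language
    # code prompt for the assistant rather than an editor command.
    return ("copilotPrompt", transcript)

print(parse_command("Hey, GitHub! go to line 34"))  # ('gotoLine', 34)
print(parse_command("toggle zen mode"))             # ('toggleZenMode', None)
```

In a real system the pattern matching would likely be handled by a speech-understanding model rather than regexes, but the shape of the problem, mapping free-form speech onto a fixed set of IDE actions with a generative fallback, is the same.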
It’s all pretty interesting!