GitHub Copilot and AI for Developers: Potential and Pitfalls

I enjoyed the 45-minute MS Ignite talk GitHub Copilot and AI for Developers: Potential and Pitfalls by Scott Hanselman and Mark Downie as it revealed interesting insights in an entertaining way.

I liked that the deck and transcript are also shared. Here's what made the talk engaging to me -

Anecdotes - Scott Hanselman is as much a gifted raconteur as he is a tech whiz. He can untangle complex topics for both tech & non-tech audiences with empathy. His anecdotes feature conversations peppered with thought-provoking questions -

Is the AI in the room with me now? 

Is the AI going to hurt me? 

Is it a virus? 

Why is Alexa not smarter?

Should you talk to your computer? 

Should you treat it like a human? 

Should you be kind to it? 

Should we name them? 

We have to ask ourselves, how does it know (about a recipe)? How does it know anything about food? How do I know that the recipe is not going to kill me? 

Pop Culture References - if you are cruel, if you are unkind, if you are mean, if you are impatient, if you say mean things to the AI, the next word will be mean, cruel, and unkind. But if you say nice things, if you compliment the AI, you're going to end up pulling from the nice parts of Stack Overflow and Google, and Bing, and the places where people are pleasant...if I said something like, you're a belligerent assistant, a lot like Benedict Cumberbatch as Sherlock Holmes. You're not super happy to be here, and you're going to give me what I need but you're not going to be thrilled about it. You're a little bit sassy.

This is just mean. It's quite cruel here. Look at this. Your lack of preference won't hinder my guidance. Here's a recipe. I can't promise it'll be stimulating as a true challenge for the intellect. It's a rather simple and mundane task. Suddenly it's rude. It will be a different flavor of rude depending on the prologue.

Live Coding - Mark live coded a bug fix by prompting Copilot for assistance and then explained what was happening at each step. In the pre-AI world, an error while presenting code on stage could have been embarrassing. Mark, on the other hand, pointed out that Copilot had erred even though the build succeeded. He proceeded to show how we can work collaboratively with Copilot, point out its understanding gaps, and arrive at the right solution.

Analogies & Examples - DasBlog implements the ActivityPub standard...that will allow blogs to communicate within the fediverse. Hanselman explained that in plain English - Twitter is awful, and there's another place, the fediverse, the federated verse, also called Mastodon, is an alternative social network. There's lots of social networks and just like you've got Threads and Facebook and Instagram, and all these different places, some of which have varying levels of awful. What you want to be able to do is you want to take the information on your website and syndicate it, just like Garfield writes a comic and it shows up in local newspapers. He's going to write his blog post and he wants it to show up in the feeds of people on Mastodon and Twitter and all of the different places, the collective federated universe or fediverse.
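DasBlog itself is a C# project, but the gist of that syndication step can be sketched briefly. The snippet below is only an illustrative sketch, not DasBlog's actual code: the URLs are invented, and a real implementation also has to discover each follower's inbox and sign the delivery request.

```python
import json

# Hypothetical identifiers for illustration only; not DasBlog's real URLs or code.
ACTOR = "https://blog.example.com/actor"        # the blog's ActivityPub actor
POST_URL = "https://blog.example.com/posts/42"  # a new blog post

# Syndicating a post to the fediverse boils down to wrapping it in a "Create"
# activity around a "Note"-like object and POSTing that JSON to each
# follower's inbox (authentication via HTTP Signatures is omitted here).
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": POST_URL + "/activity",
    "type": "Create",
    "actor": ACTOR,
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "id": POST_URL,
        "type": "Note",
        "attributedTo": ACTOR,
        "content": "<p>New blog post: hello, fediverse!</p>",
    },
}

print(json.dumps(activity, indent=2))
```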

Visual Studio is like Microsoft Word for programmers. 

Other excerpts -

When you talk to an AI, you're not talking to an intelligence, you're talking to something that picks the next likely word. 
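That quote captures the core mechanism: next-word prediction. As a toy illustration (a lookup table of word counts, nothing like a real neural model), generation is just repeatedly sampling a likely next word:

```python
import random

# Toy "language model": for each word, the words observed to follow it,
# weighted by how often. Real LLMs learn these probabilities with a neural
# network over tokens, but generation is still "pick a likely next word".
next_words = {
    "the":    {"recipe": 5, "AI": 3, "room": 2},
    "recipe": {"is": 6, "looks": 2},
    "is":     {"simple": 4, "mundane": 3, "delicious": 1},
    "AI":     {"is": 7, "answers": 2},
}

def generate(start, length=4):
    words = [start]
    for _ in range(length):
        options = next_words.get(words[-1])
        if not options:
            break
        # Sample the next word proportionally to the observed counts.
        choices, weights = zip(*options.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))  # e.g. "the recipe is simple"
```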

Context is so fundamental when you are interacting with a Copilot. If you don't establish context, you essentially get more random suggestions. 

(Copilot can)...break down each part of that regular expression...in plain English. It's...infinitely patient. 
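The regular expression from the demo isn't reproduced in the excerpt, so here is a hypothetical date-matching pattern annotated with the kind of part-by-part, plain-English breakdown the speakers describe, written as comments using Python's re.VERBOSE mode:

```python
import re

# A hypothetical pattern (not the one from the demo), explained piece by piece.
pattern = re.compile(r"""
    ^                      # start of the string
    (\d{4})                # group 1: exactly four digits, the year
    -                      # a literal hyphen separator
    (0[1-9]|1[0-2])        # group 2: month 01-12
    -                      # another literal hyphen
    (0[1-9]|[12]\d|3[01])  # group 3: day 01-31
    $                      # end of the string
""", re.VERBOSE)

print(bool(pattern.match("2023-11-15")))  # True
print(bool(pattern.match("2023-13-01")))  # False: month 13 is not allowed
```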

I want a Copilot that is going to have a conversation with me. It's going to be patient, it's going to be kind, it's going to be helpful and it's going to be that thing that I can turn to, that I can ask for advice. It's almost like having an infinite technical book. You could sit down with Copilot and say, I want to learn HTML today and have a whole conversation. It'll walk you through it like a tutorial and that's exactly what we expect from our systems. I don't want to say write me a website from scratch, it'll be fine, and then let it have the keys to the kingdom. 

(Copilot can be)...confidently wrong. We need to make sure that we are able to decide for ourselves what's right and what's wrong. Having that critical thinking skill is super important. This is a tool, it's a spanner. We need to decide what we're going to do with it.
