gzer0 a year ago

There are 5 different sign-up waiting lists. Each one needs to be signed up for separately.

[1] Copilot X: https://github.com/github-copilot/chat_waitlist_signup/

[2] Copilot Voice: https://githubnext.com/projects/copilot-voice/

[3] Copilot Docs: https://githubnext.com/projects/copilot-for-docs/

[4] Copilot for PRs: https://githubnext.com/projects/copilot-for-pull-requests/

[5] Copilot CLI: https://githubnext.com/projects/copilot-cli/

  • bigmattystyles a year ago

    Soon - Introducing Copilot Copilot - to help you navigate the Copilot offerings.

    • amelius a year ago

      Coco-pilot, with a mascot that looks like Bonzi buddy in an airplane.

      • vanillax a year ago

        Underrated comment here

    • anticensor a year ago

      No, more like Github Autopilot - to write all the code for you.

  • fortylove a year ago

    This screams to me that each internal team just wants a piece of the glory and the accompanying praise and "way to get this across the finish line team!" emails.

    • usrnm a year ago

      You're forgetting promotions

  • drusepth a year ago

    > [4] Copilot for PRs: https://githubnext.com/projects/copilot-for-pull-requests/

    One minor nit for the Githubbers inevitably lurking in the comments:

    The confirmation message for [4] says that they'll contact you at [first email address listed on your account], not [primary email address listed on your account]. I hope that's just a display bug, since I don't have access to the email I first created my account with (but I want to keep it added for commit attribution). It's not listed as the primary email, though, which is where I'd expect this kind of communication to be routed!

    • simonw a year ago

      I had that problem too.

    • brightball a year ago

      I think I prefer the WhatTheDiff approach to this one.

  • pastor_bob a year ago

    $50+ in monthly subscription fees?

    BRB buying some MSFT

    • paxys a year ago

      We forgot very quickly that there was an entire generation of developers who paid $1000-2000 or more for an MSDN subscription just as the base cost for doing business. This is what Microsoft does.

      • axlee a year ago

        Or $299 a year for the Apple Developer Program just for the great honor to be able to develop for that platform...

        • booi a year ago

          that's not entirely fair... you also need an iDevice and some form of new-ish Mac as well.

        • ericlewis a year ago

          $299 a year? It is $99.

          • drusepth a year ago

            The Developer Program plan is $99/year and enables distribution via the App Store.

            The Developer Enterprise Program plan is $299/year and enables corporate/internal distribution to employees.

            Most people only need the first, but the second is also an option. And if you need both, it's technically $398/year for the great honor.

            • atonse a year ago

              Most enterprises will spend more than $299 a year just discussing whether they should renew this subscription.

          • jjeaff a year ago

            $99 plus 15-30% of everything you create.

            • Terretta a year ago

              Try selling through Amazon.com and see what % you keep.

              Then try selling through a grocery or drugstore chain.

              • jjeaff a year ago

                Sure, but I don't have to sell on Amazon. I can sell through my own website and just pay transaction fees if I want. And many companies do just that now.

                But good luck trying to sell your new iphone app through your own website.

                At least with android, you could sell an APK download or sell through a different app store.

              • whimsicalism a year ago

                The difference being I can always walk to a different grocery store without having to pay ~$1k.

                • brookst a year ago

                  As a brand, do you get more value from being on Amazon or in a local grocery store?

                  • Breza a year ago

                    There's still more competition in the retail space and greater ability to sell across multiple stores simultaneously.

              • DeathArrow a year ago

                Have you tried selling apps through Amazon or grocery stores?

      • xxpor a year ago

        The MSDN subscription was amazing to piggyback off of my dad's as a kid, though: access to essentially everything in MS's current and back catalog that was even remotely related to dev work, including OSes :)

        • spookthesunset a year ago

          The best was when you knew somebody who worked at microsoft and could get the company discount!

          • LASR a year ago

            If you knew someone working at MSFT, you would just ask them for some free keys.

            • nwatson a year ago

              My old MSDN keys for Windows from ten years ago will still unlock Windows 10/11 installs when I need a Windows VM.

      • reaperducer a year ago

        We forgot very quickly that there was an entire generation of developers who paid $1000-2000 or more for an MSDN subscription just as the base cost for doing business.

        Or $800 for a COBOL|FORTRAN|C|APL compiler, plus $600 for a debugger.

        At least when you upgraded the next year, you got $20 off!

      • keithnz a year ago

        MSDN subscriptions were great! Virtually every single piece of Microsoft software. You'd get this huge folder of CDs back in the day!

      • nirav72 a year ago

        Did individual developers actually pay that? I’ve done quite a bit of development in the MS ecosystem and can’t recall anyone personally paying for MSDN. It was always through their employer that they got access.

        • zerkten a year ago

          Individual developers paying was always an insignificant volume. It was still within reach so devs could get it and be productive. As an example, the cost was fairly insignificant to the many .NET devs who jumped to SharePoint in the 2000s to keep their top rates. People give SP a hard time quite rightly, but as a revenue generator for devs in many markets with limited options, it was a very good choice.

          It was priced and designed for companies to buy as a bundle with other licensing. Individual devs would frequently use some of the loopholes through licensing partners to commit to what was basically a three-year subscription that spread the cost. For companies, it could mitigate some of the costs associated with the proliferation of environments that needed to be licensed from top to bottom. Later, these might stop being eligible for MSDN use and generate more revenue. License auditing was real and gave visibility into this usage.

          VMware still have a program like this going for their admin user base. It gets recommended by the r/homelab folks who haven't adopted Proxmox.

        • WaxProlix a year ago

          Consultants and freelancers for sure did - though again that's technically a business expense.

          • ghaff a year ago

            I definitely had some sort of personal MSDN subscription at one point. Plus compilers and the like were far from free. I had shelves of books. And computers themselves cost more in inflation-adjusted dollars.

        • paxys a year ago

          We will pretty soon get to a point where companies start to volume license Copilot for their entire engineering team, so really the same thing.

        • arwhatever a year ago

          I believe they tended to hand it out for free or very inexpensively to students, with substantial trial offerings for new/small businesses, etc.

      • cscurmudgeon a year ago

        While building off of free open source. Perfect.

    • csantini a year ago

      If you want it cheaper: https://how2terminal.com

      This is what happens when I build a product: Microsoft releases it three weeks later -_-

      • donmcronald a year ago

        That's super cool. I hate the pricing. I typically know everything I need to for day-to-day usage of the shell and only do things that require discovery every few months. 100 queries wouldn't be enough in those months, so I'd have some months where I'm paying for nothing and the odd month where I don't get enough usage.

        $9 per month also makes it costly enough that I wouldn't buy it as a "just to have" kind of tool. I don't think I'd get $100 of value vs searching online, especially since I attribute some negative value to tools that can be taken away from me. I don't want to pay forever and be dependent on something that could disappear tomorrow.

        I don't get why something like that needs to be an online service. I don't know much about AI, so maybe it's a lack of understanding on my part, but why can't I simply have a copy of the trained model on my local machine where there's no ongoing cost (to you) whenever I run a command? Isn't an online API a complex solution to a problem that could be solved with a local app + data?

        Maybe I just lack understanding and the models are too big or the compute required to make a query is huge. If you could give some insight I'd genuinely appreciate it.

        Even though I'd never buy it as a subscription, it's the kind of thing I'd pay for as a perpetual app. I wouldn't hesitate to pay $50 if I could install it on my machines and forget about it until it would be useful. I'd also expect to pay for updated versions of the models whenever I need them.

        Regardless, I think it's amazing as a discovery tool. I don't mind reading 'man' pages to figure out details, but I always feel like it's a hassle to discover what command I need for certain tasks.

        Also, I'm probably an outlier, since I make a lot of effort to avoid tools that rely on an internet connection to function; i.e., I won't rely on GitHub. I'll use it, but only as a push mirror.

      • MattSayar a year ago

        What a great landing page. Clear, to the point, with a self-explanatory demo. I know exactly what to do and how to use it in under 30 seconds

      • dotancohen a year ago

        Your pricing is terrific. The free plan provides just enough daily queries to try it, and the monthly plan might be a good fit for a business.

        I'm going to sign up for the free plan, not because I need such a tool, but rather so that my boss might see me using it and decide that it's worth $10 a month to her.

      • reaperducer a year ago

        If you want it cheaper: https://how2terminal.com

        Very cool. Too bad I don't do subscriptions. If you ever figure out how to do a version on localhost, I'd pay $40 or so for it.

      • dotancohen a year ago

        What is `!-*f(6s6U8Q9b` here:

          const questionCriteria = {
            filter: '!-*f(6s6U8Q9b' // body_markdown and link
          }
        
        I thought maybe it was a hard-coded CSS element name in StackOverflow answers, judging by the context, but it's not that. Could you shed some light on this?

        Found in the How2 source file `how2/lib/how2.js`. Thanks.
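        (For anyone else puzzling over this: the string looks like a Stack Exchange API filter ID, an opaque token minted via the API's /filters/create endpoint that tells the API which response fields to include. A sketch of how such a filter is used; the question ID below is an arbitrary placeholder, and no network call is made:)

```python
from urllib.parse import urlencode

API_ROOT = "https://api.stackexchange.com/2.3"

def build_questions_url(question_ids, filter_id, site="stackoverflow"):
    """Build a Stack Exchange /questions URL.

    `filter_id` is an opaque filter token (created once via the
    /filters/create endpoint) that selects which fields appear in the
    response -- per the comment in how2.js, this one presumably adds
    body_markdown and link.
    """
    ids = ";".join(str(q) for q in question_ids)
    return f"{API_ROOT}/questions/{ids}?" + urlencode(
        {"site": site, "filter": filter_id}
    )

# The placeholder question ID 12345 is arbitrary; the filter string is
# the one from how2.js.
url = build_questions_url([12345], "!-*f(6s6U8Q9b")
```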

      • shellfly a year ago

        I made a terminal AI as well, based on ChatGPT: https://github.com/shellfly/aoi . I intend to provide a way to run the shell command automatically, which can reduce a lot of copy and paste.
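        The confirm-then-execute part of that idea can be sketched in a few lines; the model call itself is out of scope here, and how aoi actually wires this up may differ:

```python
import subprocess

def run_suggested(command: str, auto: bool = False) -> str:
    """Show an AI-suggested shell command and run it after confirmation.

    `command` stands in for whatever the model suggested; generating it
    is not part of this sketch.
    """
    if not auto:
        answer = input(f"Run `{command}`? [y/N] ")
        if answer.strip().lower() != "y":
            return ""
    result = subprocess.run(command, shell=True,
                            capture_output=True, text=True)
    return result.stdout

# e.g. run_suggested("echo hello", auto=True) returns "hello\n"
```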

      • cess11 a year ago

        In case you'd like to know, there are a couple of typos on the landing page: focussed and Custome service.

    • berniedurfee a year ago

      I feel like this is one of those eras where I will, yet again, not invest early and will, yet again, not make stupid money on the other end.

      Cheers to those who know where to put their dollars to ride this AI bubble to early retirement!

    • raincole a year ago

      What worries me is not the price. It's that they mention it as a VS Code and VS feature, not as an API.

      So what if I use IntelliJ? And worse, would it eventually kill IntelliJ?

      • Godel_unicode a year ago
        • raincole a year ago

          ? Did you even read the thread? It's about Copilot X, not Copilot. I use Copilot in IntelliJ daily, and that's exactly what I'm worried about: once MS's AI tools get big enough, they won't be so friendly to third-party IDEs.

          It's not a stretch at all. Take a look at Copilot's homepage: https://github.com/features/copilot

          > Keep flying with your favorite editor

          Now, look at Copilot X's announcement:

          > We are bringing a chat interface to the editor that’s focused on developer scenarios and natively integrates with VS Code and Visual Studio

          > It recognizes what code a developer has typed, what error messages are shown, and it’s deeply embedded into the IDE.

          It reads like "we're going to make AI a competitive advantage of VS and VScode." Of course they have the right to do it, I'm just saying I hate it.

          • Godel_unicode a year ago

            Of course I did, did you read the forum rules?

            They’re releasing their editors first. Just like they did the last time. Obviously.

      • wahnfrieden a year ago

        Welcome to capitalism where competition is supposedly good

    • wysewun a year ago

      Great point. Time to do research on how much GitHub contributes to MSFT earnings. It really does seem like they're on the verge of a huge jump.

      • moonchrome a year ago

        They are bragging about 1M developers on Copilot; let's say $20/month to be generous - that's $20M/month. They probably have larger single-client revenues than that, and I doubt Copilot is high margin (I bet I use more than $10/month in compute with the queries I generate).

        Unless this goes to hundreds of dollars/month (which I would pay if it were a good experience), I don't see this making a dent.

        • theptip a year ago

          The upsell is easy to envision though. Just like GitHub.

          If developers all use it for their own projects, you have a viral growth engine. And you can charge a lot more for the enterprise license.

          SAML, fine-tuning on your private repos, etc. - this could be a very expensive seat with all the options added.

          • moonchrome a year ago

            They already have enterprise licenses at the $20/month I generously quoted above (personal is $10/month), and I think they might have a tough sell when you consider privacy/IP concerns.

            I think they are just trying to establish themselves in the market and how the market will play out will be seen over the coming years.

        • mewpmewp2 a year ago

          Probably it will be a usage-based license for enterprises, with decent margins and at a much higher cost. Soon it will be a no-brainer for all enterprises to buy. I truly think that, with a little more time, engineering costs will be down 80-90% because of all this tooling.

        • DeathArrow a year ago

          Maybe they think of waiting until everyone uses Copilot and then jack up the price.

          But that wouldn't quite work.

    • bg24 a year ago

      Where is the pricing? I looked around and it's not obvious whether I missed it. Some of these used to be in Copilot Labs. I used it and then disabled it. It was part of the monthly Copilot subscription.

    • tarruda a year ago

      It is basically free if you consider that companies will be automating a lot of the dev grunt work from now on. Learning to code using tools like Copilot will be a requirement, but instead of 10 developers you hire 1.

      • Godel_unicode a year ago

        The amount of dev work isn’t fixed. I think it’s more likely that they’ll keep the 10 (or even hire more) and make more software.

    • rvz a year ago

      > BRB buying some MSFT

      No.

      Tell me you buy into hype on high-impact macroeconomic news days like FOMC and lose money to volatility, without telling me.

      Almost everyone here knew GPT-4 would be released, months in advance, and you now decide to 'buy' MSFT as soon as it goes up to $277?

      The market really does get retail FOMO'ing into stocks on hype, and it never ceases to surprise me that it happens here too.

      • bavila a year ago

        This "tell me X without telling me" meme is, frankly, quite snarky and unnecessary.

        • travisjungroth a year ago

          It’s such a bummer that it started as a joke confessional TikTok meme and became a bitter forum reply meme.

      • _boffin_ a year ago

        I think you have a valid point or two, but I think you’re not grasping how much enterprises will be throwing money at MSFT.

        • Godel_unicode a year ago

          GitHub is already give or take a billion dollar business, and Microsoft is orders of magnitude bigger than that. This will be a nice little sideline for someone but it’s at least as valuable for mindshare as revenue.

      • mewpmewp2 a year ago

        Based on my own productivity, and on how strong the tools seem to be, I presume the current increase in valuation is still far too low. These tools will eventually reduce workforce costs by 90% at the very least, in my view. It seems to me that most people don't truly understand how things are going to change now.

        • vlunkr a year ago

          Maybe when they become much, much more sophisticated. Right now they reduce time spent writing boilerplate and looking at docs. How much of your overall time is spent on those tasks now? For most devs it's not much. Significant time is spent reading code to understand program architecture, investigating bugs caused by obscure edge cases, and refactoring or extending functionality without breaking everything. Copilot doesn't even scratch the surface of those yet.

          • mewpmewp2 a year ago

            I am very certain it has already 5x'd my productivity. But I have been practicing and using it pretty much 24/7, weekends included. I think in 6 months it will be so for everyone in very many fields.

            It also helps me understand new codebases in languages that I am not familiar with.

            In addition it is good for refactoring and can spot bugs.

            I use a combination of ChatGPT in the editor and Copilot. And I have ChatGPT one shortcut key away as an OS app.

          • DeathArrow a year ago

            It can find bugs, suggest fixes, write tests and do PRs.

  • smcleod a year ago

    I really like Copilot; I've used it every day since it was in closed alpha. Copilot Voice, however, is hilariously terrible in comparison.

    • adeelk93 a year ago

      As someone who is not familiar with these sorts of accessibility challenges, what makes copilot voice terrible?

      • smcleod a year ago

        It doesn't seem to understand anything I say, and it also required installing Java! It felt like I'd gone back in time to 2012 or something.

        • re-thc a year ago

          What did you say?

    • tough a year ago

      talonvoice.com was developed by a developer who himself needed voice-coding tech.

      • lahwran a year ago

        I use Talon regularly and in fact had it active while browsing Hacker News. This comment was written and submitted with Talon!

  • sqs a year ago

    "GitHub Copilot X is currently a representation of GitHub’s vision for the future rather than an available product offering of GitHub Copilot. As we continue to design, test, and build features that fall into the GitHub Copilot X vision, we are also taking the time to determine the best way to provide them to our customers."

    From the FAQ at https://github.com/features/preview/copilot-x (When will GitHub Copilot X be available and how much does it cost?).

  • renewiltord a year ago

    Thank you. Better comment than the release docs.

    This is going to be fantastic!

  • cloudking a year ago

    Thanks, this saved a lot of time!

w-m a year ago

I tried getting myself on the CLI waitlist (https://githubnext.com/projects/copilot-cli/). That they use ffmpeg as the example hits a huge pain point for me. But:

"Next Waitlist by GitHub Next would like permission to: Act on your behalf

Not owned or operated by GitHub"

Why does signing up for a waitlist require me to give permissions to an app to act in my account? An app that isn't even officially from GitHub?

This sets a bad example for permission requests, getting people to just click through the dialog, which requests too much access, and from the wrong domain.

  • filmgirlcw a year ago

    It’s officially from GitHub. There is some weird language (that I’ll absolutely pass feedback on to the team), I think because GitHub Next is a different organization from GitHub proper, but I can assure you, this is an official GitHub app.

    But I totally agree that this isn’t a great/clear message about where this is from.

    • madeofpalk a year ago

      I raised this concern before about how unclear this is and the concern was pretty much dismissed https://news.ycombinator.com/item?id=33539150

      "Techies" often lament about how silly users are about falling for phishing tricks, but then they also routinely make it so difficult to determine what's legit from what's fake.

      • filmgirlcw a year ago

        Totally agree. I did the same double-take with some stuff at Universe and I work here. As Idan said back then, there are lots of reasons that we want Next to be separate from GitHub proper. But you’re absolutely correct that we should make it clearer that anything from Next is not a rogue org but is still part of GitHub.

        Thanks for the feedback.

      • basch a year ago

        >routinely make it so difficult to determine what's legit from what's fake.

        Like the number of times on my phone where I get prompted for a username and password different from the app I'm in (to make some kind of connection), but the URL is suppressed from visibility. An astoundingly poor design choice that has proliferated into every single variant of that interface design flow.

        • cyrusatjam a year ago

          Not to mention, the app you're in can inject JS into the page it renders. Best practice is to open in your default browser and then allow the deeplink in the callback_uri to return you back to the original app

      • xp84 a year ago

        > make it so difficult to determine what's legit from what's fake.

        So true. I still think DNS was one of the only chances to grant, and teach, the general population the ability to confidently make trust decisions. IDN attacks notwithstanding, it's hard to beat an inspectable string that contains its own trust chain, compared to "app names" that can often be set to the hacker's own choice. Sadly, antipatterns seem more common than useful patterns.

        (Antipatterns like using a domain like "talktotacobell . com" (don't visit that) as the site to complete a receipt survey. Or every public school district or even school having a random .com or .org.)

        People decided that end users would find an address like xhs.xsd.ed.ca.gov too complicated, yet people found 10-digit completely meaningless telephone numbers perfectly fine for decades.

      • BubbleRings a year ago

        This becoming the top thread here might get their attention. You have a very good point.

    • RulerOf a year ago

      The domain name on that authorization prompt was more than enough to get me to decline.

      I wouldn't have hesitated at all if it were at next.github.com.

      • filmgirlcw a year ago

        Totally get it.

        I’ve passed feedback on to the team so (hopefully) the CLI app will be clearly from GitHub soon (we need to transfer it to the regular GitHub org, and then there hopefully won’t be any confusion). GitHub Next exists in a different org from GitHub proper for lots of reasons, but we should definitely make it clearer that those experiments are still from GitHub.

        • rjzzleep a year ago

          To me the auth flow looks wrong. The CLI OAuth app would reasonably get "act on your behalf", but the website to apply for access should not, under any circumstances, do that.

          Maybe it's just copy and pasted from somewhere, but it looks wrong to me regardless.

        • RulerOf a year ago

          FYI, after digging down a few links from there and winding up at the GitHub Next org page[1], seeing the verified email next@github.com and the official site link to https://githubnext.com was enough validation for me.

          Can't wait for Voice Copilot :)

          1: https://github.com/githubnext

        • vlovich123 a year ago

          Doing it under a delegated domain under github.com sounds like it would work (still a separate org, but it's a signal that this is from main GitHub).

        • Kye a year ago

          It seems like one organization being able to specify a relationship with another, possibly with the ability to set a list of approved permissions, would solve this.

      • eitland a year ago

        The number of companies, including serious banks, that don't understand this is mind-blowing.

        Just the other day I had to verify with a Norwegian bank that the KYC form they linked to (which IMNSHO was utter nonsense, as usual) was actually from them and not from someone who had gained access and snuck in a link, because the domain was something completely different.

    • idan a year ago

      Correct! We know this is a terribly confusing thing. Hopefully by the next big launch we can work out a way to have GitHub Next things not bear the "not owned or operated" thing.

    • someguyformdn a year ago

      Did you actually pay for or license MDN for the database that you are building off of and selling as your "product"? YOU ARE A BUNCH OF THIEVES. People have been working on contributing to the MDN brand and the docs since at least 1995.

      ```We’re starting with documentation for React, Azure Docs, and MDN, so we can learn and iterate quickly with the developers and users of these projects.```

    • Borgz a year ago

      Unfortunately, I think this is another indication of a lack of understanding by GitHub of how their OAuth/GitHub App systems are expected to function by end users.

      I'm reminded of this incident [1] from a few months ago. Allegedly, a malicious actor abused GitHub's poorly designed OAuth permissions to obtain up to 500 stars from developers without their consent, all thanks to a "sign in with GitHub" button and a flawed consent screen that did not communicate what the victims were consenting to. Even worse, GitHub allegedly decided to suspend at least one victim's account.

      We're left with a number of questions:

      1. Why does GitHub give third-party apps permission to star repos when it is apparently against the terms of service to automate such an action?

      2. Why does GitHub lump this permission in with public_repo, a scope that grants read and write access to all public repositories? [2]

      3. Why does the consent UI for this scope display simply as

          Repositories
          Public repositories
      
      and not even mention that this grants write access unless the user clicks on it? [3] (it also doesn't mention that it gives permission to star repos)

      4. Why does GitHub punish victims with account suspension for being tricked into giving consent to malicious apps?

      It is good that GitHub is taking some steps to improve account security, such as fine-grained personal access tokens and mandatory 2FA. But these improvements do not seem to be extending to the OAuth system. The GitHub App system, while better in that it has granular permissions, is also flawed with its mysterious "act on your behalf" consent UI. [4] [5]

      [1]: https://news.ycombinator.com/item?id=33917962

      [2]: https://docs.github.com/en/developers/apps/building-oauth-ap...

      [3]: https://news.ycombinator.com/item?id=33919481

      [4]: https://github.com/community/community/discussions/37117

      [5]: https://github.com/cirruslabs/cirrus-ci-docs/issues/751

  • macNchz a year ago

    It's interesting because the Copilot for PRs preview uses an app that does say "Owned and Operated by GitHub". Probably just an oversight, but I'm pretty sensitive about what access GitHub apps have, especially given the list of major attacks that have happened in recent years that were initiated by someone sneaking malicious payloads in somewhere along the development pipeline.

    Curious why they'd use a totally different domain in the first place–how much access do you think you could get to private repos just by cloning that page onto githubfuture.com (available) and spearphishing interesting targets?

    • w-m a year ago

      Signing up for the technical preview of Copilot Chat also worked as expected, it was just a checkmark and a button on github.com, not even an oauth dialog.

  • Bedon292 a year ago

    I was wondering the same thing. It seems like it is labeled weirdly since it is part of GitHub, but definitely was triple checking what was going on with that.

  • ilaksh a year ago

    There are several CLI programs like this. I built mine before ChatGPT came out, so it uses text-davinci-003: https://github.com/runvnc/askleo If you Google for "ChatGPT CLI" I assume some will come up, and those are probably better because they are cheaper and faster.

  • localhost a year ago

    While I don't remember to use it (too much AI in too many tools!), Warp has this feature as well, called "Warp AI". This is what it told me for the demo scenario:

      ffmpeg -i input.mp4 -i watermark.png -filter_complex "overlay=10:10" output.mp4
    
    A bunch of descriptive words, too.

  • envy2 a year ago

    I think this is just a standard API access prompt. The GitHub Next page clearly indicates it is (C) GitHub Inc and the blog post makes GitHub's control clear.

    • madeofpalk a year ago

      > The GitHub Next page clearly indicates it is (C) GitHub Inc and the blog post makes GitHub's control clear.

      This is not clear at all. Phishing websites always say (C) Bank of America. That doesn't make it legit.

    • retrocryptid a year ago

      Well. That's what the AI wants you to believe.

  • darepublic a year ago

    +1 for Copilot being good for ffmpeg. I recently did a project involving complicated ffmpeg filters, and it could handle them on its own quite well.

  • pastor_bob a year ago

    Interesting, how does this know what executables are available on your PATH?

    • nicky0 a year ago

      I guess that since it runs as a shell executable, it can simply examine the PATH.
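      That guess is roughly right: anything running in your shell can enumerate what's on $PATH directly. A sketch of the general technique (not necessarily what Copilot CLI actually does):

```python
import os

def path_executables():
    """Return the set of executable names found on $PATH."""
    names = set()
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        try:
            for entry in os.listdir(directory):
                full = os.path.join(directory, entry)
                if os.path.isfile(full) and os.access(full, os.X_OK):
                    names.add(entry)
        except OSError:
            continue  # skip missing or unreadable PATH entries
    return names

# e.g. "sh" should appear in path_executables() on any POSIX system
```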

  • joshmanders a year ago

    > Why does signing up for a waitlist require me to give permissions to an app to act in my account? An app that isn't even officially from GitHub?

    Because it uses GitHub's OAuth flow, and all applications that use this flow show the same thing.

    • justin_oaks a year ago

      OAuth allows the use of different scopes for different levels of access. So just because it uses OAuth doesn't mean it has to require major account access.

      I assume the problem is that GitHub doesn't have sufficiently granular permissions (OAuth scopes), or that the requesting app doesn't use the correct scopes.
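      For reference, the scopes an OAuth app requests are just a parameter on the authorize URL, so asking for less is a one-line change on the app's side. A sketch (the client_id is a placeholder); per GitHub's docs, an empty scope grants read-only access to public information, while `public_repo` grants read/write access to public repositories:

```python
from urllib.parse import urlencode

def github_authorize_url(client_id, scopes):
    """Build a GitHub OAuth authorize URL with an explicit scope list."""
    params = {"client_id": client_id, "scope": " ".join(scopes)}
    return "https://github.com/login/oauth/authorize?" + urlencode(params)

# Read-only public info -- plenty for a waitlist signup:
minimal = github_authorize_url("Iv1.placeholder", [])
# Read/write on all public repos -- what the consent UI understates:
broad = github_authorize_url("Iv1.placeholder", ["public_repo"])
```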

drdrek a year ago

I would love to hear from other people here, but I tried it for a while and dropped my subscription. For short things it was nice, but the larger the suggestion, the worse it got. I found myself constantly mentally debugging the output it suggested. I do not know if it was faster, but I was mentally exhausted by it, unable to go for long periods of time. Dumb code completion is predictable: I know I'll need to press down three times before it's even shown to me, which lets me think ahead. With "smart" code completion I need to constantly stop my train of thought to process whatever it throws at me. I even tried to just trust the system, playing it fast and loose, not double-checking everything, but then it just produced bugs.

I'm a developer with 16 years of experience, currently working mainly with Python for API work.

I'd love to hear accounts of other people, please add your background if you feel comfortable. I want to see if there is some correlation to experience, programming languages or use cases.

  • throwaway4aday a year ago

    I currently use copilot and 80% of the benefit I get from it is boilerplate and refactoring, the rest is just using it as smart autocomplete where I can zip through adding a bunch of properties or arguments. I use it for a lot of greenfield stuff though which is really where it shines since a lot of that work is just standing up all the essential bits and pieces before you have to do anything complex. It's helped me massively with going from an idea to a working implementation both by getting rid of a lot of the boring typing and by keeping me going when I started to get a bit run down or uninspired.

    I can understand why you might not like it if you were using it for critical things that needed to be well planned and debugged before running. I don't find it very good at intricate work but that's ok with me since I want to really slow down at those points and think about what I'm doing.

    As a side note, using ChatGPT with GPT-4 or even just GPT turbo is an amazing unblocker for projects where you need to use unfamiliar packages, APIs or languages. You can just talk to it about what you're trying to do and it'll provide you with great examples and explanations. It won't be right 100% of the time but it's right enough to get you unstuck and a lot faster than searching through docs or stackoverflow for a good answer. It helps to be very precise with your problem statement as well, like specifying the version of the package you want to use or a time frame. Those little prompt tricks remind me a lot of the Google-fu we had to learn to search effectively. I'm excited that Copilot is going to be moving to GPT-4 with chat built in, it'll unify the whole process.

  • babl-yc a year ago

    (Software dev for 15+ years)

    I'm using it for Typescript + NodeJS development. I find Copilot most valuable when it's something I'd need to Google anyway, like how to format a date string or how to do X in selenium. 8 out of 10 times the answer is right, and the other times it is at least interesting (gives me an idea of what to look for).

    This quick feedback is _way_ faster than googling and keeps me in the IDE, and also just makes it more enjoyable to code when there is this "pair programming" partner that I can interact with via code/comments and it will generate ideas for me, even if they aren't all perfect.

  • yanis_t a year ago

    I mostly code in JS/TS and ruby, and find it pretty handy. Especially when I need to write some unit tests, I usually only write it("should work like this and like that"), and 90% of the time it generates a useful unit test[0].

    Also, I find it very useful when I code in an unfamiliar environment, for example Python. While I usually know what I want to do, I'm not exactly familiar with the details of the syntax, or what a specific library API looks like. This is where it shines, in my experience. Huge time-saver.

    [0] https://www.strictmode.io/articles/using-github-copilot-for-...

  • photochemsyn a year ago

    That's been my (limited) experience: you have to debug its output, and if you don't, you have problems later. Pasting Copilot's code into ChatGPT was sort of interesting but not really a time-saver, although pretty useful for understanding new concepts. Ultimately I'd still have to go read the documentation to actually understand how to use something new correctly, however. Not really sure if it's worth paying for.

    Where it is pretty useful I think is in examining large chunks of poorly commented code bases, where you're using CoPilot to generate comments describing what the code is supposed to be doing, i.e. '#here we ..'.

  • aloer a year ago

    My trial ran out yesterday and I cancelled the subscription. It feels limiting on its own. There are three use cases off the top of my head

    - auto complete. This one is amazing but I’m not willing to spend 10/month for only that

    - generate code inline. The main purpose of copilot. It works okay but too often I feel like I’m faster if I google things myself. Perhaps I’m just a very fast googler and reader. Wouldn’t surprise me

    - use a solution and adapt it to my own code. I don’t know how this could work without copilot having access to my browser and knowing what I just read somewhere. I’m very excited about this but right now copilot does not seem the right tool

    But above all I cancelled for two reasons: it’s too slow and I can’t trust my privacy and code IP to be respected

    Edit: I also feel there is a lot of secondary information lost. If I google I have multiple tabs and windows and (temporal) structure. I also learn about neighboring concepts via stackoverflow comments, or I learn about how to navigate the docs for whatever I’m doing right now.

    With copilot I am not exposed to all of this. Not yet

    • ChatGTP a year ago

      > Edit: I also feel there is a lot of secondary information lost. If I google I have multiple tabs and windows and (temporal) structure. I also learn about neighboring concepts via stackoverflow comments, or I learn about how to navigate the docs for whatever I’m doing right now.

      This! I think what we lose from these auto-completion systems is a differing of opinions, options and abilities to learn and evolve our ideas.

      I don't always want the most commonly weighted average answer. I want to see, learn and try new things.

      > But above all I cancelled for two reasons: it’s too slow and I can’t trust my privacy and code IP to be respected

      I'm starting to look for GitHub alternatives for this reason. I'm sure it's no surprise where they're getting all the great training data from: it's our code. Which might be OK, but you won't even be able to innovate to make some money, because Copilot will be reading all your suggestions.

      I wonder if Microsoft trains it on their own proprietary code? :)

  • dimal a year ago

    I haven't found it very useful. It's been useful for writing tests, but in most other stuff it's like an annoying person who's always finishing your sentences for you... wrong. It'll autocomplete stuff like file paths and just make up garbage, which messes up my intellisense autocomplete which would actually get it right.

    I'm on the trial and I'll probably cancel it before I have to pay. If it was easier to turn it on and off or only use at specific moments, I might keep it, but as it is, it hasn't been great for me. I have 25 years of experience. Maybe if I was younger, I'd like it, but at this point, I'm usually not typing something unless I know what I want to type, so it's just a distraction.

  • Rapzid a year ago

    Late reply but in the same boat with VSCode. God help anyone not using a typed language to more easily catch Copilot's BS; it regularly just makes up methods and interfaces that don't exist.

    It interferes too much with the standard autocomplete/intellisense. I want to use copilot in an on-demand fashion for help with libs I'm not familiar with and scaffolding, but the team seems dead-set on having it always on and in your face; no settings to swap around the default for normal autocomplete.

    Maybe I can try toggling it on and off more aggressively. Perhaps Copilot-X's prompt will be more useful for me. IDK but the experience is disappointing especially compared to the potential.

    Edit: Also, I alluded to this, but it's just plain wrong a lot. The inline suggestions are like 75% wrong and I'm fighting to get my autocomplete to show up instead.

  • ddren a year ago

    I used it for a while but I found that too many suggestions are worthless and having to consider them makes me waste more time than just writing the code myself. For the things that it is useful like snippets, I found that ChatGPT is better anyway.

  • Myrmornis a year ago

    I don't know about the new developments but with last year's copilot, it's great as long as you don't have it on all the time. Just turn it on when you want it to complete something. Unfortunately VSCode didn't/doesn't make that easy -- you have to hack something together yourself:

    https://github.com/orgs/community/discussions/7553#discussio...

    • Epskampie a year ago

      Totally agree, I cancelled my subscription because the way the extension works (worked?) is just way too distracting, shoving (often wrong) suggestions in my face all the time.

      It should have a mode where it only ever suggests a single best guess when I press a certain shortcut.

  • raincole a year ago

    At this point, Copilot is as natural as autocomplete and syntax highlighting to me. Of course I can still write code without them, but it would feel really off and counter-productive.

  • brundolf a year ago

    I haven't tried it yet, but this was how I assumed I would feel

    Something I'm actually more optimistic/curious about is the potential for code analysis, like an advanced linter. "Tell me if you think you see any bugs in this code." That's something I would totally use even as an experienced programmer working in an older, slow-going codebase (maybe especially in those)

    • tomaskafka a year ago

      That's exactly what I'm waiting for, either with Copilot X, or GPT4 32k API access.

  • nprateem a year ago

    It mostly just gets in my way by being frequently wrong, and disabling my IDE's autocomplete. If I didn't get it free I wouldn't pay for it.

  • pragmatick a year ago

    I also didn't find it very helpful. It would constantly suggest code that didn't even compile, e.g. suggest calling setters for fields that weren't there. I spent so much time reading the suggestions or deleting them after using them and finding out they were bullshit that I didn't save any time at all.

  • web3-is-a-scam a year ago

    Started using Copilot this week as a trial and my experience so far echoes yours. Ruby dev doing lots of API work, and I'm spending way more time mentally debugging all the hallucinated variables and functions it vomits out, and fixing incorrect interfaces, and I find it way more exhausting than just writing the code myself.

    IntelliJ inspections and refactorings blow Copilot out of the water; it's not even a contest.

  • phist_mcgee a year ago

    Copilot for generating TS prop types on React components is fantastic, or guessing the right type for a library object.

  • danielvaughn a year ago

    I haven't used copilot but your experience sounds exactly like what I would expect. Since AI is based on prediction, it makes sense that broader predictions would be less accurate. I think stringing together output from a lot of smaller predictions would yield better results. Which, at the end of the day, means that a human + AI will always be more productive than AI on its own. At least for the foreseeable future.

elil17 a year ago

Yesterday, Copilot could not write a program with SymPy, which is a library for doing symbolic math in Python. (e.g., it would consistently confuse symbols in equations with Python variables). Today it uses SymPy as well as it uses NumPy (occasional mistakes, but overall it has the right idea). Absolutely astounding.
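
The failure mode is easy to see in miniature: SymPy requires symbols to be declared explicitly, and treating them as ordinary Python variables is exactly the confusion described. A minimal correct usage (this particular equation is just an example, not from the original post):

```python
import sympy as sp

# Symbols are declared explicitly; they are not ordinary Python variables.
x, y = sp.symbols("x y")

# Solve x**2 + y = 4 for x symbolically.
solutions = sp.solve(sp.Eq(x**2 + y, 4), x)
print(solutions)  # two branches: the positive and negative square roots of 4 - y
```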

  • toomuchtodo a year ago

    I don’t think people understand the hockey stick we’re on. Don’t focus too hard on the failings, look at the scaffolding being rapidly deployed that makes the iterative improvement process exceptionally tight.

    • brookst a year ago

      It’s the same with every paradigm shift.

      “GUIs are pretty but slow. Ok, they’re faster but just glued on to CLI programs. Ok, they’re native but less information dense. Ok, that’s a good thing, but…”

      “The internet is nice but you still have to go to the store/bank. Ok, you can shop/bank online but you still have to call airlines. Ok you can buy airline tickets but you still have to go to the library to research. Ok, you can research but you still have to drive in to work. Ok…”

      Maybe it’s a half empty / half full thing, but there is some portion of the populace that can’t acknowledge the promise of anything until it is perfect and comprehensive.

      And, yes, there's another portion that can't acknowledge gaps or flaws if the promise is sufficiently huge. But it's the former that grinds my gears.

      • scandox a year ago

        Your overall point is reasonable, but to take your point about GUIs... we actually lost an awful lot moving from text-based interfaces to GUIs. We got universal adoption and more or less zero training required, but we got very slow and inflexible interfaces.

        • GeneralMaximus a year ago

          I use tons of GUI apps with mostly a keyboard (or sometimes only a keyboard). For example, my Git client of choice is Tower, and I’m able to get all my work done without taking my hands off the keyboard. It took me a bit of time to get used to the shortcuts, but now I’m able to work off my muscle memory.

          Well written GUI apps will always have good support for keyboard-only use. This is doubly true for apps that care deeply about accessibility, since some people have disabilities that make it hard for them to use a mouse or trackpad.

          On the other hand, TUIs will never support non-Latin scripts properly, or have good support for screen readers, or be easily discoverable without a manual, or have uniform automation support using native automation APIs. In my view, CLIs are strictly inferior to GUIs unless you’re putting together data pipelines.

        • chaxor a year ago

          Command line didn't vanish ... You can still use it. Approximately 90% of my work is done via TUI, so it's certainly not gone.

          • scandox a year ago

            I'm talking about people behind counters, people in stock rooms, people in all kinds of data entry and so on. The terminal did vanish from many places. We're not the market.

            • aprinsen a year ago

              I worked at a paint store in my teens (2010s) that operated a beautiful text only PoS system that had been built decades ago and it was incredible. So efficient, purpose built, nary a click.

              • roguas a year ago

                Yeah, and the owner would have to have a SWE on payroll to keep it working. That's why people migrated to more general, albeit sometimes also more confusing, apps/platforms.

                • aprinsen a year ago

                  It was a chain (Sherwin Williams), so I assume they could afford an in house dev team.

      • headcanon a year ago

        I would assert that the "here are the current flaws" side is still necessary in the overall conversation, since they provide the necessary feedback that allows technology to improve. Similar to political debates influencing the Overton window of a population, I see the "future promise" vs "current flaws" as a human algorithm realizing itself. The debate never ends, but the "mean" value keeps moving as technology progresses. As a "future promise" person myself I used to be annoyed by the "current flaws" side as well until I started seeing things through this lens.

      • darkwater a year ago

        I'm speaking more generally but

        > Maybe it’s a half empty / half full thing, but there is some portion of the populace that can’t acknowledge the promise of anything until it is perfect and comprehensive.

        Probably because paradigm shifts or new ways of doing things come with a lot of trade-offs that can be seen only after the change has been absorbed. Early adopters are usually blind to these issues (or choose to be), while conservatives don't look at the improvements but, projecting from past experience, suspect something not so good down the road.

    • agent281 a year ago

      I'm mostly curious how far the hockey stick will go up. Eventually most things level off.

      • throwaway4aday a year ago

        Considering this hockey stick is in the field that is in the business of making hockey sticks it could go pretty far. A hockey stick maximizer if you will.

      • substation13 a year ago

        This is the big question isn't it. With self driving cars we have been on the final 20% for what seems like forever.

        • elil17 a year ago

          Big difference here is that it becomes more useful as it gets better. Self driving cars aren't useful until they reach a certain threshold.

          • substation13 a year ago

            I think there is a feedback loop with self-driving cars.

            More self-driving cars on the road -> more driving data for that company -> better self driving cars

        • xiphias2 a year ago

          We are seeing continuous improvements in image and video generation / understanding every year. It was just probably too naive to think that self driving can work well without getting to human level video understanding (which is still not AGI).

          But when we have that, it’s hard to believe that self driving won’t work.

      • danenania a year ago

        To me the key issue are these 'hallucinations'--mistakes that seem plausible but are completely made up, like API endpoints that would be super useful except for the small problem that they don't exist. GPT4 is better than GPT3 on these but it still produces a lot of them.

        The question is whether these are somehow inherent to the LLM approach or whether scaling up and continued improvements can eventually get rid of them.

        They are the main barrier at this point between a very useful tool, but one that still needs to have all its output carefully checked by humans when it comes to anything important, and a true autonomous agent that can be given full tasks to do on its own.

        • elil17 a year ago

          It seems pretty clear to me that you could do some more RL to enforce truth-telling/admitting when it does not know - it would just be much more labor intensive compared to the RLHF they have already done because fact checking is difficult.

          • danenania a year ago

            I'd imagine they've already been doing lots of RL in this direction, which explains the improvements in GPT4, but it's still an issue. Maybe they can eventually eliminate hallucinations completely, but I could also imagine that it will end up being difficult to do that without lessening its creativity across the board. Perhaps making things up is fundamental to how LLMs work and trying to stop it from doing that will kill the magic. I'm not an AI researcher so I really have no idea--just speculating.

            I'm not at all trying to downplay the power or significance of LLMs, btw, in case that's why I'm getting downvoted... I'm using copilot/GPT4 every day and they are massive productivity boosters. But currently I see them as tools for producing rough drafts that need to be revised and checked over. If they can't solve hallucinations, LLMs will stay in this lane, which is still incredible, amazing, and useful, but won't necessarily get us to the AI endgame that the hype is predicting.

  • vintermann a year ago

    Yes, I feel like it is much better than HN has given it credit for in earlier discussions. But that may be influenced by the fact that I only started using it relatively recently; I've probably mostly seen it at its best.

    Also, I have no illusions about how language models work. I notice that the sloppier the code I'm editing, the worse the suggestions (and the dumber the comments) it makes. If I start writing a comment and Copilot immediately suggests "TODO: delete this", that's a better indicator of poor code health than any metric I know about. It's not a magic wand, but it can be extremely useful if you use it sensibly.

    Then again, you can just decide to forget how it works and ask it questions directly in comments:

        // Hey GPT3, write me a poem
        // ANSWER: 
    
    It's not sensible, but it's great fun.

    • Kiro a year ago

      You're being too kind on the HN crowd. We're usually extremely pessimistic for no good reason and useless at predicting things. It seems like we just love to hate on stuff.

      • acdha a year ago

        People here love hot takes, especially if it stakes out a contrarian position as the smart one. Criticizing the big players is just too easy to do that with since it doesn’t require you to build an alternative and is often right in some aspect so you can get validation.

      • qgin a year ago

        It's definitely not considered high status here to be positive about anything

      • kreas a year ago

        I would say this about most developers not just HN. It took years for me to break that mindset.

      • vintermann a year ago

        If you think that you should see what Slashdot is like these days, lol.

  • gunshai a year ago

    I used SymPy the other day for the first time. I was blown away by that library; I wish I had known about it years ago.

  • DeathArrow a year ago

    They automatically switched Copilot to GPT-4 without you enrolling in the limited-access list?

  • barrenko a year ago

    They were probably apprehensive beforehand, and now just unleashed it on everything.

  • retrocryptid a year ago

    So it's on par with a junior software engineer?

    • elil17 a year ago

      No, I don't think it's comparable to a junior engineer. It works best for pretty short snippets of code. It also can't decide when to run the program and how to interpret/make changes based on the results.

      • steve_adams_86 a year ago

        > It also can't decide when to run the program and how to interpret/make changes based on the results.

        Not quite, but a bit of plumbing can get you closer. Not human using a computer close, but interestingly closer nonetheless.

        I've been trying to accomplish something akin to this by having a program monitor and alter another program within a virtual machine, using GPT-generated solutions to error traces to correct bugs in the sandboxed program.

        It watches the program to see when an error occurs, feeds the error to GPT with pertinent code, then tries to splice in the solution.

        It kind of works. I don't think we're going to see human-levels of success from this in the immediate future, but I was able to write a simple event-based system which alters a program to resolve simple bugs. It even does it on a different git branch, and there is some stubbed out code and prompts for generating tests. In my manual testing, this actually worked too. If the tests passed I was going to have it push the change set and create a PR explaining the changes, tests, etc.

        I doubt I'll continue now that Copilot is doing this already. My point though is that with the right configuration, the right data and prompts, and a system orchestrating the start/stop/test patterns based on the state of the sandboxed program, you can begin to achieve something akin to an inexperienced person solving bugs.

        Sometimes it does a terrible job and other times it kind of falls over itself. But we're already leaps and bounds ahead of previous systems, and I just cobbled this together with what's possible via OpenAI's API.

        The crazy part is that there are so many possible layers. Like say we get our initial solution and we verify that it works. Well, now we can have a system which optimizes the implementation. Like a PR buddy that observes the implementation and determines: should this test be appended to an existing suite of tests? Can the test case simply be added to an existing table-driven test? How can we streamline this patch to avoid an endless stream of additional files and tests to maintain? I think that's actually tractable already. While the success rate won't be 100% today, it'll clearly only improve.
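
        A stripped-down sketch of the run/diagnose/patch loop described above (all names here are illustrative guesses, `ask_llm` stands in for whatever chat-completion wrapper is used, and the real system adds a VM, git branches, and generated tests on top):

```python
import subprocess

def run_and_capture(cmd):
    """Run the sandboxed program; return its exit code and stderr."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return proc.returncode, proc.stderr

def attempt_fix(cmd, source_path, ask_llm, max_rounds=3):
    """Run `cmd`; on failure, ask the model for a patched source and retry.

    `ask_llm` is any callable mapping a prompt string to replacement source.
    Returns True once the program exits cleanly, False if rounds run out.
    """
    for _ in range(max_rounds):
        code, err = run_and_capture(cmd)
        if code == 0:
            return True  # program now runs without error
        with open(source_path) as f:
            source = f.read()
        # Feed the error trace plus the pertinent code to the model.
        patched = ask_llm(
            "Fix the bug causing this traceback:\n"
            f"{err}\n--- source ---\n{source}"
        )
        # Splice the proposed solution back into the sandboxed program.
        with open(source_path, "w") as f:
            f.write(patched)
    return False
```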

      • mynameisvlad a year ago

        So… a junior developer?

        I (mostly) kid but some seniors I know struggle at those things.

        • elil17 a year ago

          I mean I get what you're saying. But it doesn't just struggle with those things - they aren't even the sort of things it can do.

          • retrocryptid a year ago

            Yes. Of all the comments I've made on HN, that one most deserves to be down-voted. Jr. Devs I know are motivated and reasonably intelligent.

    • vintermann a year ago

      Sure. One that never gets tired, never gets frustrated, always does his best, and has a pause button.

      • geraneum a year ago

        This applies to managers and many founders too. I imagine it's going to affect the startup scene more than people expect, and the number of startups following the current model, where you build a custom software solution for a problem, may also fall. It applies to all white-collar workers. This is the goal of a corporation like Microsoft in developing such a system (to consolidate that added value), and it will happen one day, as it happened with physical labor. Although we are not quite there yet. It might be far off or not. Who knows?

jchw a year ago

Not very impressed with either Copilot or ChatGPT for programming purposes. I'm sure it will be interesting for some people, but whenever I really feel like I could benefit from an intelligent AI, it usually falls flat. Most commonly, it will just generate calls that don't exist, and if you try to get it to implement them, it will write bullshit. Now the thing is, maybe it really does improve developer productivity by virtue of letting the developer autocomplete boring things. To be fair, I did see some opportunity here. But on the other hand, it also seemed to harm productivity, as I had to carefully review everything it did, and eventually had to disable Copilot so that it would stop annoying me with nonsense completions every time I stopped typing. I'm not really sure if language models are actually getting closer to what we really want, or if they're getting closer to some other maximum that just looks similar. Clearly they're doing something that is novel and probably even useful, but with increasing parameter counts it seems that the underlying problems are not changing. Something certainly seems like it must be fundamentally wrong...

  • teaearlgraycold a year ago

    I use them all the time with success. For copilot it’s very useful for writing tests and autocompleting repetitive code. Like writing a Typescript interface that matches a JSON object, or listing properties of an object into an array. ChatGPT is at worst a good rubber duck, and at best a superpowered documentation tool. If I ever get confused by docker or some popular JavaScript library it’s able to give me a custom tailored answer that usually solves my problem on the first or second try.

    Neither of them are any good at just writing a ton of code (unless the code is mostly boilerplate). But they’re amazing keystroke savers and a next generation stack overflow.

    • 8organicbits a year ago

      How often are you seeing bogus documentation? I see that concern often mentioned. I tried ChatGPT and the first question got an obviously wrong answer. That seems worse than stackoverflow to me. Maybe I'm not holding it right?

      • teaearlgraycold a year ago

        I think it entirely depends on how popular the library is. React for example is so common online that GPT has the whole API figured out.

        Even if it’s wrong the stakes are low. Just paste the error back into the chat and it will give you an alternative.

        But the real solution here is the new Copilot X for docs. Feed in the code base and all documentation and now you have the real source of truth at the helm.

    • danenania a year ago

      What would be amazing for Copilot + Typescript is if it could somehow integrate with the TS compiler (or perhaps just get sufficient fine-tuning on the TS type system) so that it only generates valid/type-safe suggestions.

      • FractalHQ a year ago

        I think about this often. If copilot could hook into the Typescript language server, it would be much less likely to hallucinate and generate blatantly wrong suggestions.

  • thebackup a year ago

    I second this. My impressions so far are pretty much the same. Have been trying to get ChatGPT to generate code for some algorithms that are pretty close to well known implementations but with some twists and it absolutely fails no matter how much I tried to provide hints to change the suggestions it gave me. In the end I just went back to hand coding them since it actually saves me the frustration of trying to get ChatGPT to bend in the direction I need.

    • Donald a year ago

      Have you tried GPT-4?

      • thebackup a year ago

        Will give it a go. But probably it’ll take another generation or two before it’s good enough.

        • Donald a year ago

          Cool. I’d be interested in the algorithm that you’re giving it to write and what problem it is having with the output.

  • woeirua a year ago

    ChatGPT << GPT4 when it comes to coding.

ren_engineer a year ago

Microsoft really seems to have wrapped up the developer ecosystem with VS Code and the Github acquisition combined with OpenAI. They are going to have an absurd amount of data to optimize their models thanks to that, not sure how other AI focused companies can overcome that

Have to wonder at what point developers will remember how anti-developer Microsoft used to be and potentially move away from their ecosystem. Credit to Microsoft's PR team for somehow managing to turn around public opinion about them; it's an all-timer.

  • fossuser a year ago

    Microsoft is so well positioned on this that I think Silicon Valley has forgotten what it looks like when they don't have a horrible CEO. When Microsoft executes well, they're a scary force. They were exceptional under Gates, and it looks like they are again.

    There’s a massive paradigm shift we’re just at the beginning of and Microsoft has been putting pieces in place for the last couple of years. Nadella has really turned things around for them.

    Google and meta are scrambling.

    Things should get interesting.

    • hbn a year ago

      I wouldn't go so far as to say they're exceptional. They're gaining good graces with developers by offering a bunch of free (for now) tools that people like. But most of their big breadwinner software is still abysmal to use.

      I just tried cold starting Word and getting into a blank document on my i9 MacBook Pro and it took about 15 seconds. Windows is increasingly a dystopian user-tracking and ad-serving platform that happens to run the applications you need, and they still constantly break basic system functions. For a while I couldn't open JPEGs in the built-in photo preview app. I gave Edge a shot for a good 6 months or so and eventually gave up because they kept breaking basic functionality that I never saw broken in Chrome (there was literally a period of a month or two where, if you grabbed the scrollbar and dragged it, the bar would disappear, jutting the entire page sideways and breaking scrolling until you closed the tab and reopened it; this is on the default, built-in OS browser!). Also, my god, their SSO experience is so clunky and prone to breaking.

      • fossuser a year ago

        They’re strategically exceptional, but beyond that a lot of their products (or subsidiaries) are also dominant for good reason (vscode, GitHub).

        O365 and the product integration there (teams) is why they crushed slack despite slack’s headstart. Slack sold out to languish at salesforce while Microsoft will now just own that space.

        Cherry-picking a Microsoft app for macOS is like looking at iTunes on Windows; it's not really representative. Even Windows itself isn't that important (that was one of the big strategic changes after Ballmer left and Nadella took over). You're right about Edge though (which is why Microsoft abandoned its own engine for Chromium).

        Microsoft also ships (something Google can’t do very well). I think their current trajectory is probably undervalued because people have not properly updated from outdated historical sentiment.

        • jackosdev a year ago

          vscode is an incredible piece of software, better than all the paid options in my opinion. The amount of features they pump out month to month is outstanding; it's just a bit slow due to Electron. I never understood why they put so much effort into a free product that I run from Linux and Mac, but I'm happily paying the Copilot subscription, so it all makes sense now.

          • bombela a year ago

            It's free and Electron-based because they can run it in a web browser, running everything on MS Azure, with code on GitHub and CI on GitHub, etc. The whole dev experience offered to companies as a service via a series of web applications. Companies will love this.

            Just get any web browser, preferably Microsoft Edge, on Microsoft Windows Pro on a Microsoft Surface laptop. Open a Microsoft GitHub workspace to dev for your Microsoft Azure-hosted Linux VM. Run the CI on GitHub. Use Microsoft O365 for your design doc. And Microsoft Teams for communication.

            Poor little Linux in the middle.

            • CoolCold a year ago

              Why poor little Linux? They got what they wanted.

              No gloomy project managers above you, just write the code you like and express yourself? Check.

              No telemetry to know what average Joe the Normie uses and wants? Check.

              No spending time on meetings, plannings, and boring strategy discussions; just do a bit here and there, whatever your soul wants today? Check.

              Love to tinker and customize your setup without leaving the IT department a chance to standardize software and settings rollouts, no MDM covering YOUR system? You are out of the enterprise - Check.

              Dreams came true, why poor?

              • bombela a year ago

                Because the Linux desktop is forbidden at many tech companies nowadays, for the very reasons you wrote!

                At a company I worked at, over time they wanted everybody on Mac or Windows. No code locally, only ssh onto a Linux VM. When you argue you can also ssh from Linux, the response is: we cannot run the spyware on Linux.

                Sure, you can change jobs. But I have noticed the trend all over among my circle of friends.

          • withinboredom a year ago

            You honestly had me do a double take. Copilot costs money? Maybe I get it through some other thing, but it's been free for me as long as I can remember. It's wrong so often that I generally keep it on because it's entertaining. I wouldn't pay for it.

          • nunobrito a year ago

            I would have to differ. The best IDE would have been Borland Pascal 7/C back in the 90s, then Delphi, and eventually NetBeans/Eclipse took that position as something worthwhile across Linux, Mac, and Windows.

            For me Delphi Pascal was the pinnacle of compiler/IDE combos. A simply fantastic combination of GUI editor, assembler support, fast compiler, and truly useful documentation with practical examples at the click of a button, without needing the internet.

            VSCode with a proper Copilot seems to be a game changer. Crossing fingers.

          • theshrike79 a year ago

            Yep. IntelliJ's stuff is the only one that actually competes with VSCode - and the playing field is surprisingly even if you add the VSCode plugin ecosystem to it.

            For C# Rider is still the gold standard in my book, but for Go I still prefer VSCode to GoLand.

          • insane_dreamer a year ago

            better than PyCharm for python?

            • jackosdev a year ago

              Nah, that editor is much better if you're just using Python, but vscode imo is better for polyglots; the experience and keybinds stay the same across languages once you work out `tasks.json` and `launch.json`.
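              The cross-language consistency described above comes from wiring every project through the same task labels. A minimal `tasks.json` sketch (the `make` command and `$gcc` problem matcher are placeholders; substitute whatever the project actually builds with):

```json
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "build",
      "type": "shell",
      "command": "make",
      "group": { "kind": "build", "isDefault": true },
      "problemMatcher": ["$gcc"]
    }
  ]
}
```

              With a default build task defined this way in each project, the same keybinding (Ctrl+Shift+B / Cmd+Shift+B) builds the project regardless of language.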

            • OOPMan a year ago

              Nope, but VsCode addicts are usually too cheap to try a paid-for tool like PyCharm

              • snotrockets a year ago

                There’s a free, open source community edition, which provides most of the functionality.

        • zelphirkalt a year ago

          Slack sucks, but if you compare it with Teams, it is still gold. If anyone asks me whether I would rather join a voice chat on Slack or Teams, the answer is 100% of the time Slack. You never know what bugs Teams will cook up this time to ruin your call.

          O365 is a child's toy for non-professional document creation. Quickly hacking a document together, maybe. Any normal non-web office suite blows O365 out of the water in terms of creating maintainable, non-directly-formatted documents. Now that they are deprecating offline Office (I think the last release was 2017 or so), their office suite has gone downhill at alarming speed.

          VS Codium is OK-ish and has some cool features, but ultimately does not get close to what I have in Emacs in most aspects that actually matter, so that's not a convincing offer either.

          Their software is OK at best and rubbish on average. And don't even get me started on Windows itself.

          • mynameisash a year ago

            A quick search turns up Office 2021[0], and I see no indication that offline Office is being discontinued.

            And claiming that O365 is a toy? Literally the only other online office suite I've used or heard of is GDocs, which has a fraction of the functionality that I use from O365.

            Your post doesn't sound serious at all to me.

            [0] https://www.microsoft.com/en-us/microsoft-365/p/office-profe...

            • fossuser a year ago

              It was clear to me when they mentioned Emacs that their opinion on this can just be ignored. That isn't to say Emacs isn't a great thing for a certain niche, but it's just not something that matters in this kind of strategic product discussion (and thinking it's comparable is just a signal it's not worth engaging imo).

              There are many reasons why VSCode is dominant and Emacs isn't; similarly, there are many reasons Slack failed to compete (even after taking out a full-page ad pretending to be Apple before suing Microsoft). If you're going to pretend to be Apple, you'd better be as good. If you're going to mock the competition, you'd better hope you're Apple and not Netscape [0][1].

              I too think Slack is a great product, but that's not enough. Slack needed to expand into a more competitive offering and they failed to do that.

              [0]: https://thehistoryoftheweb.com/browser-wars/

              [1]: https://www.huffpost.com/entry/slack-microsoft-open-letter_n...

            • zelphirkalt a year ago

              GDocs is a toy as well. Maybe worse than O365. If you only know those two, then I can understand your opinion.

              Try using offline office suites, ideally not MS Office, and use their writing tool. So many things are missing in these online office suites. I assure you the post is serious, and I have used multiple online and offline office suites. Perhaps you should explore more and make use of styles and all that. You will quickly notice how many things are missing in the online office suites.

          • Kwpolska a year ago

            Desktop Office apps are still available, fully supported, and fully functional. Office 2021 is the latest buy-once release, but Microsoft 365 subscribers get all the desktop apps with their subscription (and they get continuous updates). The web apps are separate and have their limitations. Microsoft 365 subscribers get to choose between the two. Non-subscribers get only the web version with some features removed.

        • adzm a year ago

          Unfortunate about Slack. It really seems to have stagnated. At least we have Discord; I'm really glad they turned down Microsoft's 12 billion offer.

          • fossuser a year ago

            I think it's more about Salesforce; I have a pretty negative opinion of them, and everything they acquire seems to limp along or die (and the good employees leave immediately). It just seems like a boring place to be (nice skyscraper though).

            Microsoft has a better track record with acquisitions.

            • eloisius a year ago

              Salesforce wastes all their effort on new acquisitions, making business-driven decisions like “add Einstein to it!” or “integrate it with the Lightning Experience (TM)!” instead of doing anything remotely useful for pre-acquisition customers. Once it’s adequately diced up, they can add it as another line item to renewals from customers who already pay Salesforce so much money that they don’t care.

          • toastal a year ago

            > at least we have Discord

            Poe's Law?

          • nicce a year ago

            Discord is yet another walled garden, to be fair. It might be a good replacement for some little private communities.

            But large communities are also moving to Discord instead of using public, indexable, findable platforms. They used to be public in the past, and people were able to read information from them without going through the process of creating an account, getting accepted, or even finding the whole community.

            Discord is terrible for storing long-term information in text form. People try to keep some pinned posts, but no: information disappears, whereas on forums it stayed.

            And let's not even start with privacy.

            • moffkalast a year ago

              > information disappears

              Wait really? I've never seen that happen on Discord, I can still see messages from over half a decade ago. Why would they delete data they mine?

              Slack on the other hand deletes everything immediately if you're not on the paid plan of course.

              • nicce a year ago

                > information disappears

                The main point was that you can't access or view it without creating an account first, or even find the whole server. You might even need to give a phone number before you can view anything, then wait 30 minutes to verify that you are not a bot. And all this to find a solution for a problem you might never return to. The search in Discord is kind of terrible.

                Also, if you are a couple of days or even hours late for the discussion, it might be too late to take part or continue it in large communities. Discord lives in the moment; it is not a place to store information.

                On forums, on the other hand, it is not a problem to continue some discussion a couple of days later. The quality is often much, much higher.

                • moffkalast a year ago

                  Not sure what you mean; you just need one account for all of Discord, and joining a new server is just one click away. Or do you make a new account for each server? That seems unreasonably hard to keep track of. I guess the search could be better, but I've never really had too many problems with it, and I've seen far, far worse implementations, such as Rocket.Chat, which has completely broken search that literally never returns anything.

                  But yes chat is chat, you can't compare it to a forum style board because they're not trying to do the same thing. You do typically need both a Discourse and a chat server for any major organization.

              • veb a year ago

                Not _immediately_. I think it's after 10,000 messages that they start to remove some.

                • anticensor a year ago

                  Those 10k messages amount to less than a day in a highly active business.

            • fossuser a year ago

              +1 I’m hoping urbit can provide a solution to this, but it’s still not at the quality bar I want (though I think there’s a path to get it there).

        • moffkalast a year ago

          > a lot of their products (or subsidiaries) are also dominant for good reason (vscode, GitHub)

          Yeah, that one doesn't count; they've made exactly zero changes since the acquisition, and Actions still has downtime every few weeks.

        • matwood a year ago

          > integration there (teams) is why they crushed slack despite slack’s headstart

          Teams is terrible software and only beat Slack because it is included 'free' in every O365 sub. GSuite is much better at collaboration than O365, but because of decades of MS lock-in, most people still need Office to deal with interop.

          Slack recognized it was going to be hard to compete with a free feature from a product suite most companies are forced to have. SF could have been a good home, but they struggle at handling acquisitions.

      • Cipater a year ago

        It takes three seconds to cold start Word for me on an old (2015), underpowered (core i5, 8GB RAM, low end SSD) HP laptop.

      • bitwize a year ago

        You're complaining about 15 seconds? I can remember the Windows 3.x days when it took minutes to warm Word up.

    • daydream a year ago

      > When Microsoft executes well, they’re a scary force. They were exceptional under Gates and it looks like they are again.

      I agree with this. And what I think is so fascinating is how much they left on the table during this very same time. Steven Sinofsky's Hardcore Software substack is an amazing read[1]. And it really shows how much they got wrong as well as how much they got right.

      MS is firing on all cylinders. Both the OpenAI partnership and the GitHub acquisition are looking like genius moves right now. Google in particular should be very afraid.

      [1] Though quite long. Very, very long. But it's well worth reading all of it. There are untold numbers of gems in there.

      • ignoramous a year ago

        > GitHub acquisition are looking like genius moves right now

        Surprising that AWS didn't compete for the acquisition. They stand to lose a bunch too. That said, except for OS and browsers, it is all coming together for Microsoft, including their Xbox division.

        • siva7 a year ago

          Google competed but wasn't willing to pay the price tag. Microsoft understood better how powerful GitHub's potential was at the time. GitHub was likely Microsoft's best acquisition in the last two decades.

        • DeathArrow a year ago

          > That said, except for OS and Browsers

          Windows is not negligible.

      • LawTalkingGuy a year ago

        I agree that MS is executing well, but why would Google be afraid?

        There's no AI moat keeping anyone from replicating what MS is doing. They can train on everything on GitHub just like MS can.

        If anything, this is where they should both be afraid because an upstart with a browser extension could replace them.

        • daydream a year ago

          > I agree that MS is executing well, but why would Google be afraid? There's no AI moat keeping anyone from replicating what MS is doing.

          It's not an AI moat. It's about business models.

          If MS takes even a small fraction of Google's search traffic, and/or converts a small fraction of people from Chrome to Edge, it'll hit Google's bottom line at the one place that makes up a huge majority of their revenue and profits. MS has nothing to lose, and Bing's GPT-4 integration is very good.

          Flip side: there's no area where Google significantly threatens MS. Office suites: MS Copilot announcements go significantly further than Google's (Business Chat). Cloud: Azure's relationship with OpenAI and rapid product expansion could see it gain share.

          MS is moving very quickly and coordinating across the entire company. Google is moving slowly. And company-wide execution? Well. That has never been their strong suit.

        • gkbrk a year ago

          > An upstart with a browser extension could replace them.

          Google regularly bans or breaks browser extensions that are inconvenient for their profits.

          They also control the main app store for Android devices.

          And for apps installed externally, they have a backdoor to remove any app they want from Android devices that they claim is for malware.

    • quijoteuniv a year ago

      The masterstroke is that they are including Linux, via WSL and also in Windows Server. They are one step away from being a Linux distribution that runs legacy Windows software within containers :D

      • dathinab a year ago

        or actively removing many of the reasons why people ended up trying out Linux distributions in the past.

        Like, e.g., university students needing to run programs which don't run natively on Windows: many would try out native Linux distros, and some would stay. Now all of them can just use WSL. Same for devs who need to develop for Linux servers, etc.

        Basically, they have accepted that Windows Server has failed and that their server license business model isn't that good anymore due to how the cloud changed things. So instead of pushing for a Windows-everywhere ecosystem, they now embrace Linux on servers (preferably on Azure ;) ) and Windows on the desktop, using WSL to bridge the dev experience and also the university student use case.

        But you can be sure that if they see a way to make it harder to install Linux on systems and get away with it, both legally and PR-wise (probably using some excuse about "security"), they will do so. Especially if they can push the blame onto others (like hardware vendors not implementing some option in the BIOS which is needed to allow other OSes to be installed). Though at least for now I expect them to act carefully so as not to damage their new image.

        Also, one way they failed the "Windows desktop" thing is by producing a pretty bad out-of-the-box desktop experience for many people (like, I'm fine paying for an OS, but not if there is even a single ad in there, or bloatware). This creates reasons for people to switch to Linux which had been much less common during the Windows 7 days.

        • fossuser a year ago

          I use macOS in large part because it's a *nix OS under the hood.

          WSL gives a lot of that power to Windows.

          • fesc a year ago

            Except it isn't, so e.g. running containers still needs a VM.

            I wish macOS had something like WSL built-in.

          • dathinab a year ago

            macOS is tempting, but it's a pretty bad *nix IMHO.

            The amount of times I had to change ad-hoc scripts to work around macOS limitations is absurd.

            • fossuser a year ago

              Yeah it’s a tradeoff, that part is worse but everything else is a lot better (battery life, performance, hardware quality, software integration, design etc.)

            • eddyg a year ago

              That’s likely because macOS is BSD-based. (Which may be inconvenient at times, but it doesn’t mean it’s “bad”.)

              Not all the world is GNU.

        • DeathArrow a year ago

          > Or like devs which need to develop for Linux servers etc.

          I develop for Linux but I use WSL only indirectly through Docker Desktop.

        • DeathArrow a year ago

          > like I'm fine paying for an OS but not if there is even a single ad in there

          I can't remember the last time I paid for Windows. Microsoft usually offers free upgrades.

          I haven't seen any ads on Windows. I frankly find it hard to believe there are ads in Windows.

          • dathinab a year ago

            > I can't remember last time I paid for Windows.

            Every time you buy a new laptop or pre-built computer with Windows installed, you (very likely) implicitly bought a license.

            But I agree that because Windows handed out a lot of "free upgrades," even outside of their officially supported upgrade paths, they missed out on a lot of license revenue; it made more users upgrade, though, so it was probably worth it.

            Most importantly, Microsoft mainly cares about businesses buying Pro licenses, potentially in huge batches.

            > ... disabled ads ... [from another adjacent comment]

            Or uses a Pi-hole, or software which disables them for you but which you might have installed to e.g. set privacy settings, or replaced components with 3rd-party ones, or has that LTT Linus ability to subconsciously filter out ads. What matters is that there are a lot of people who have had the ad experience.

          • ipaddr a year ago

            Sounds like you are on an older version of Windows, or maybe you disabled them in a drunken rage.

      • wankle a year ago

        I'm a long-time Ubuntu user, but if Microsoft went Linux, and it wasn't an anti-VM (to see what I mean, try upgrading Win 10 in a VM), ad-infested, user-tracking machine, then I might have to get back in bed with Microsoft.

        • ajolly a year ago

          That's no longer the case for Windows 11, or, if I remember correctly, later versions of Windows 10.

          • wankle a year ago

            I just checked: Windows 10 in a VM (running on KVM on Linux) refuses to upgrade to Windows 11. I only keep it around in case a contract requires something regarding Windows. Windows 10 goes out of support next year, and then I will delete the VM. I can reject any clients that would need me to touch Windows.

      • withinboredom a year ago

        If only they could get IPv6 working in WSL so that shenanigans[1] aren't required.

        [1]: https://github.com/withinboredom/ipv6-wsl

        • quijoteuniv a year ago

          Shenanigans?! Do not get me started. For a year I have had to remember to close any WSL session (command line or VSC) before putting the PC to sleep, because on waking, the Vmmem process will eat up the CPU. So I had to restart or do a weird workaround, but the Linux session was gone anyway. There is a GitHub issue about it that's 2-3 years old.

    • mattferderer a year ago

      Not saying Ballmer was great or bad, but it could be argued that the US government handicapped Microsoft more than Ballmer being a bad CEO did. To my understanding, Gates was also very active with the company during those years.

      Ballmer inherited a company at the top with everyone wanting to crush it & make it dissolve.

      Nadella inherited an underdog.

      I think this should be considered when thinking about the legacy of the CEO.

      • fossuser a year ago

        I don't buy these excuses for Ballmer.

        - Massive miss on mobile

        - Delayed cloud stuff because of obsession with windows

        - Delayed cross platform apps because of obsession with windows for same reason

        https://stratechery.com/2018/the-end-of-windows/

        > "That memo prompted me to write a post entitled Services, Not Devices that argued that Ballmer’s strategic priorities were exactly backwards: Microsoft’s services should be businesses in their own right, not Windows’ differentiators. Ballmer, though, followed-through on his memo by buying Nokia; it speaks to Microsoft’s dysfunction that he was allowed to spend billions on a deal that allegedly played a large role in his ouster."

        I think he ultimately held them back actively with bad strategy and things only started getting better when he was finally gone. Nadella inherited an underdog because the previous decade's decisions caused them to become one.

    • passion__desire a year ago

      As a young developer, I always wondered why Google didn't buy Java from Sun. Wouldn't it have been a strategic buy?

      • airstrike a year ago

        Not sure what to make of that, but I'm just glad we don't live in a world in which Google + Java is the winning horse

        • nunobrito a year ago

          cough Android cough

          +2.5 billion Google Android devices running their apps overwhelmingly in Java.

          It wasn't a coincidence this was chosen that way.

      • mistrial9 a year ago

        Google is literally the next-generation Sun; many of the same people were involved. The Sun Microsystems company was scrapped for (valuable) parts, with a lot of internal competitive moments.

      • letsdothisagain a year ago

        Google wasn't a big player when Oracle bought Sun. Don't ever underestimate them. There is a reason Oracle, of all companies, is the government's pick to annex TikTok.

    • intelVISA a year ago

      I'm the first to crap on MS, but they've really pulled it together recently, with a lot of big plays coming to fruition. I don't use any of their tech, but I can respect what they've done as a business.

    • bredren a year ago

      If it were not for the impending headset, Apple would look a wee bit caught out as well.

      I suspect once Apple turns toward gen AI, all of its other advantages (infra, hardware, install base, services, etc.) will propel it to the forefront.

      • fossuser a year ago

        Apple has always had its own hardware based silo to some extent (excellent products and design, fully integrated).

        Still, I suspect the AI stuff will be hard for them. They were first to market with Siri and it’s still trash 13 years(?!) later. It’s always been what they’re weakest at; I’m not sure LLMs will be different for them.

        The headset could still give them a nice hardware platform advantage though if that UX becomes the main new interface.

        • bredren a year ago

          AI in support of user interfaces has been downright awful. IIRC, Siri has the lowest customer satisfaction of any product at Apple.

          However, the company has deployed machine learning via its Neural Engine, which has a huge amount of penetration. This has already shown the iOS fleet is ready to do gen AI at the edge.

          The company has focused largely on STT and image processing but has worked to support use of the chip for general ML via transformers. [1]

          I’d say Apple has only failed in the way everyone did: failure to foresee the potential impact of generative AI.

          Apple seemed unable to find a use case that would help its ecosystem. I am surprised it never released Copilot-like behavior in Xcode. When will this occur?

          [1] https://machinelearning.apple.com/research/neural-engine-tra...

          • pantulis a year ago

            > I’d say Apple has only failed in the way everyone did: failure to foresee the potential impact of generative AI.

            This is so true that now the only differentiating capability is execution, and I'd say that Microsoft is excelling here in an unexpected way.

        • zer00eyz a year ago

          > Still, I suspect the AI stuff will be hard for them.

          They bought Siri! What stops them from buying someone for "AI"?

          • fossuser a year ago

            Siri still sucks?

            • withinboredom a year ago

              "Siri, set an alarm for 5pm eastern standard time" ... you get an alarm for 5pm local time, with the memo 'eastern standard time' (assuming aren't located in eastern standard time, this is not the expected behavior).

            • shagie a year ago

              Siri isn't supposed to be great; it's a voice interface to a few standardized interfaces that applications can hook into.

              The system-provided intents ( https://developer.apple.com/documentation/sirikit ) are rather limited. While we can say "it should be more," the architecture doesn't appear to be something that is easily extensible by anyone (Apple included).

              However, for those intents it does quite well, and most processing is done on-device rather than in the cloud, which fundamentally changes the economics and capability. It is much less expensive to run than Alexa (where nearly everything is in the cloud), but it also can't do as much.

              Try this: turn on airplane mode and do "hey Siri, what time is it?" or "hey Siri, open notes." While those are indeed very limited examples (there are others, such as interacting with HomeKit, where it needs the LAN), it shows that much of the work is done on the phone.

              This also means that its capabilities are limited to what you can run on the phone.

              • fossuser a year ago

                That feels like a retroactive explanation of the current status quo to me. I doubt Apple would agree "Siri isn't supposed to be great" - they just failed to live up to what they wanted.

                We'll see if that changes.

                • shagie a year ago

                  The "what they hoped" is a question of "what who hoped?"

                  When you look at SiriKit compared to Google and Alexa, it is an entirely different approach that isn't designed for general voice-assistant tooling; rather, the intents show that it is designed for specific functionality of specific types of applications.

                  Asking Siri for things outside of those intents was always delegated out to some other service (Wolfram Alpha was the choice for a while).

                  Siri was never designed to try to monetize the voice interface (compare Alexa and Google) and thus wasn't trying to do everything, and SiriKit shows that it can't do quite a bit. So the fact that it can't do everything shouldn't be a surprise to Apple.

                  Comparing Siri to Alexa, they are very different architectures with different goals and support costs.

                  If you look at https://www.apple.com/siri/ you should get the idea that this is an interface to common tasks, not a general "do everything and chat about it" assistant. What's more, it limits what goes off-device (whereas Alexa and Google do all speech-to-text in the cloud).

                  • fossuser a year ago

                    Just watch the introduction of the Siri product: https://www.youtube.com/watch?v=agzItTz35QQ

                    "Your intelligent assistant that helps you get things done just by asking"

                    I suspect Scott Forstall and Jobs wanted it to be what LLMs show the potential for it to be, not the crappy, barely functioning timer-setting app it currently is.

                    Siri is dumb as rocks, it's so bad at basic queries it's not worth trying to use.

                    • shagie a year ago

                      The tooling that it was designed for was things like https://youtu.be/agzItTz35QQ?t=709

                      Those were the intents that were set up.

                      Yes, it would be nice to have LLM-style power, but that isn't how Siri was architected even from the very start. Word combinations are recognized as certain intents and parsed into function calls into apps that register that they are able to handle that function call.

                      If there was no match for the intent, it was sent to Wolfram Alpha for a knowledge-base lookup. While Wolfram is really good, it certainly isn't a chat bot.

                      Siri wasn't supposed to be smart. Siri was intended as an interface to the existing apps: phone, music, messages, calendar, reminders, map routes, email, and weather.

                      When you look at that segment, at 12:48: "just take your phone and ask Siri to set a timer for 30 minutes and you're done." Siri was very much intended as a timer-setting app.

                      What functionality do you see in this segment that isn't designed as an interface to existing apps? At what time cue do you see them promising something smarter than what was designed?

                      • fossuser a year ago

                        They open with what they want it to be, an “intelligent assistant,” and then show the capabilities it currently had.

                        Since that time in 2011 it has gotten no closer to an “intelligent assistant”. That’s a failure, imo.

                        Google’s is much better at answering basic queries.

                        • shagie a year ago

                          Google's voice assistant is an interface to google cloud applications.

                          Siri is an interface to a limited set of the apps on the iPhone.

                          These are fundamentally different architectures, with implications for privacy and for where each company has compute resources it can use.

                          If you are after a general knowledge search engine, Google will certainly outperform Siri.

                          Google has better cloud integration for a lot of their functionality. Apple doesn't have that amount of cloud resources that it can use and is a device first company rather than a cloud first company.

                          If you want to say "android can search google better" Ok. I'll grant that. If that makes it "smarter" - ok. Android is smarter than Siri because it can search google better.

                          If you want to say "android can control apps on its phone better" - I really want to see evidence of that. Whether it's the ability for Android to control 3rd party music apps (e.g. Spotify), report the weather or calendar, or set up alarms... I don't believe that Android is any better than Siri in that regard.

                          If you want to chat with it (e.g. "what is the answer to life the universe and everything?") then those are cute responses that are programmed in.

                          Do you have other criteria that you are using to compare the different devices other than its ability to search Google?

                          • fossuser a year ago

                            You can adversarially frame "ability to intelligently answer questions" or "act as an intelligent assistant" as "ability to search Google", but those are different things. I'm comparing the use case of the former as an end user.

                            Siri will often uselessly "find results on the web and send them to my iPhone". Google can answer more queries directly as well as do basic timer-setting stuff. The thing you're missing is that the implementation detail is irrelevant to the end user; it's an issue of capability.

                            Your point about Apple being worse at cloud is partly what I'm talking about (and one of the reasons siri is so much worse). It's why Apple has not done a good job with this up until now. I don't buy the "it's intentionally not capable/bad by design because that's what they were going for" argument.

                            My prediction is Apple will make some sort of move here. Whether that's an investment in stable diffusion or something else I don't know. I expect what they do to have an apple flavor (on-device, privacy focused), but I think it will be leveraged to make Siri actually useful (and more of an actually intelligent and capable assistant).

                            • shagie a year ago

                              My advice would be "don't expect them to do anything with LLM or similar so that when they don't, you won't be disappointed."

                              Expecting Apple (or any company) to be chasing the current hype is more likely to be disappointing (see Google and Bard, or Bing and its mistakes). Apple, with its very cautious curation of its brand image, would likely be some time out.

                              I would also point out that Apple's prominent place in regulatory views would make it more hesitant to do things that they may have to open up.

                              Wait until after the regulatory dust has settled... and after the various lawsuits about copyright infringement or section 230 and GPT have settled ( https://www.marketplace.org/shows/marketplace-tech/chatgpt-i... ).

                              I don't believe that Apple has any appetite for becoming more of a target for government regulators or wading into untested legal waters. But that's my crystal ball - yours apparently sees different things.

                              • bredren a year ago

                                I agree w parent that the end user is who matters--Siri is just not very good at answering what seem like basic questions.

                                What made Google amazing was that it settled conversational disputes or provided instant (if limited) familiarity with a subject. Siri fails to provide verbal feedback on relatively simple questions, instead referring people to their iPhones for "web results."

                                As an end user, the product's failure to understand or make sense of the user's intent is even harder to deal with in Home / HomeKit. I often find myself pulling up the Home app to hunt down and manually operate some accessory because voice requests are just failing.

                                Common patterns happen throughout a home covered in HomeKit and Homepods and yet this AI is unable to provide reasonable suggestions for automation modifications, scene tweaks or suggestions for additional accessories.

                                Siri-based requests for songs or albums from Apple Music on HomePod are abysmal, returning covers, or flat-out the wrong genre or era that my listening habits should weight heavily against.

                                It is just bad--architecture design be damned the product fails under "normal" use. Outwardly, it seems like a MobileMe-level failure, where SJ asked at a town hall "Can anyone tell me what MobileMe is supposed to do?"

                                All that said, I agree with this comment that it is a mistake to expect Apple to integrate LLM that uses any known model into its product.

                                Even if Apple wanted to, I don't know where the company could source data that is manicured to be "safe" enough to serve as a basis for responses by Siri.

                                It doesn't really matter to end users how they fix it.

                                The company's job is to drop the product or iterate until it figures out how to better satisfy the end user.

        • eitland a year ago

          > Still, I suspect the AI stuff will be hard for them. They were first to market with Siri and it’s still trash 13yrs(?!) later.

          Long time Android user here, only iPhone since 3 years ago:

          Siri I use to set timers and I sometimes use its unsolicited suggestions.

          Google only tried to make practical jokes at my expense, like suggesting I call the customer's CTO or text a friend-of-a-friend at 0400 in the morning.

          One single time I can remember Google actually getting a suggestion correct.

          This probably works better today than four years ago and maybe it always worked better if you were in a US timezone and spoke American English, but with Siri setting timers at least works and a few times a year it comes up with smart suggestions.

          (Yes, I'm not too impressed with Siri either.)

        • insane_dreamer a year ago

          Really surprised Apple didn't buy OpenAI; they will regret that MSFT did so.

          • ugh123 a year ago

            OpenAI was already in bed with MSFT when they started working on Codex/Copilot together a year or so back. So the synergies were already there when OpenAI needed to scale up ChatGPT for the masses using Azure infra

          • bredren a year ago

            I think apple made a choice a while back to yield software engineering workflows outside Swift to Microsoft.

            I had expected Apple to produce its own version of GitHub, but the space is apparently too messy.

            The earliest value of OpenAI is similarly messy. Lots of press about how to jailbreak or trick the thing. Bad answers, or questions about the legality of what the models produce.

            None of that comes anywhere near where apple can use its strengths. It just opens new surfaces that require expertise the company doesn’t have.

      • listless a year ago

        I'm curious on this as well - although a VR headset at this point feels like a foul ball straight backwards and into the net.

        Apple is good at a lot of things but it is AWFUL at AI. Siri is still the worst experience you can have with a language model. That said, all they need to do is integrate some OpenAI APIs. In which case, Microsoft STILL wins because of its investment there and where OpenAI's enormous compute happens - which is likely in Azure.

        • chipotle_coyote a year ago

          Siri isn't a language model, AFAIK. I'm sure it's using machine learning on the back end, but it's doing something very different.

          The thing is, a voice assistant needs to actually perform Natural Language Understanding -- it has to understand the intent of what's being said to it. That's actually a very different AI problem than what LLMs are doing at this point. Samsung's Bixby -- yes, the one everyone turns off -- can handle a "conversation" like this: (This is run using the "Space Resorts" sample capsule, a fictitious hotel booking system)

          ---

          User: Show me hotels near Mars

          Bixby: I found these space resorts on Mars. (Shows two, “Martian Oasis” and “Space Y”)

          User: [taps Martian Oasis] make a reservation for Christmas week

          Bixby: Which habitat pod would you like? (Shows two)

          User: Honeymoon

          Bixby: Are you sure you want to book this trip? (Shows the hotel, “Number of astronauts: 2”, and the dates for Christmas week)

          User: change the astronauts to 4

          Bixby: (Shows the same confirmation screen, with the number changed)

          User: Yes

          Bixby: I hope you have a great time at Martian Oasis. (Shows booking info)

          ---

          Now, in some ways this is no great shakes, but I don't think "all [Apple] needs to do is integrate some OpenAI APIs," because they don't have an API for NLU and deriving intent. GPT-4 et al. aren't currently designed for this kind of problem space. You could use them to make Siri (and Bixby and Alexa) better at answering general questions, but you can't use them to make the voice assistant do something, just respond to something.
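          The slot-filling pattern behind that kind of exchange can be sketched like this (a toy model, not Samsung's actual capsule API):

```python
# Toy multi-turn slot-filling dialog state, loosely modeled on the Bixby
# exchange above (not Samsung's actual capsule API).
class BookingDialog:
    REQUIRED = ("resort", "dates", "pod")

    def __init__(self):
        self.slots = {"astronauts": 2}   # defaults can be pre-filled

    def missing(self):
        return [s for s in self.REQUIRED if s not in self.slots]

    def say(self, **filled):
        """Each user turn fills or overrides slots; the assistant either
        asks for the next missing slot or shows a confirmation."""
        self.slots.update(filled)
        if self.missing():
            return f"Which {self.missing()[0]} would you like?"
        return f"Confirm booking: {self.slots}"

d = BookingDialog()
d.say(resort="Martian Oasis", dates="Christmas week")  # -> "Which pod would you like?"
d.say(pod="Honeymoon")                                 # -> confirmation, astronauts still 2
d.say(astronauts=4)                                    # -> same confirmation, count now 4
```

          Each turn fills or overrides slots; the assistant asks for whatever required slot is still missing, then shows a confirmation that later turns can amend.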

          Bixby 2 (the current version) is, incidentally, what came out of Viv, the voice AI startup that the Siri folks made after they left Apple.

      • endtime a year ago

        I don't get the impression Apple ever caught up with Alexa or Google Assistant. That said, this is a bit different, since the AI tech is a bit more commoditized than with those voice assistants, and the product problem is the hard part. On the other hand, Apple has positioned themselves as extremely privacy-respecting - I wonder if that will affect their ability to use their users' data to train models.

        It'll be interesting to see, anyway.

      • pxoe a year ago

        a headset is not a suite of AI products. it's kinda...irrelevant. meta has a headset, and that's working out great for them, isn't it.

        so far, there's nothing. there's only vague rumors, or not even - just assumptions that "surely apple will do something about this", "they'll turn to ai eventually". well, they have siri. which doesn't exactly inspire hope for much.

        • DeathArrow a year ago

          > meta has a headset, and that's working out great for them, isn't it.

          A previous VR wave failed in the '90s. If this one fails, too, we'll see another in 20 years.

      • newaccount2023 a year ago

        for now, Apple is in a different league

        if Tim Cook wants $10 bln in new profits, he can simply pass some arbitrary judgement like "the next iPhone will only be compatible with new AirPods we will introduce at the same time" and everyone will comply

        • DeathArrow a year ago

          And that's why I don't use their products.

    • neom a year ago

      Spot on. Microsoft are a joke when they're losing and a joke when they're winning. I'm old enough to have seen the full circle and I'm super curious to see what happens when M$ is back on top.

      • dathinab a year ago

        - lock down of PC software to the Microsoft app store, probably roughly at the same time apple also tightens a lock down on macOS

        - more anti competitive steps to make installing desktop Linux on consumer hardware hard while arguing you could just use WSL and it's "for your security" and they are not anti Linux because they embrace Linux on servers

        - trying to kill Valve, there should only be Windows games and only through the Windows app store

        - maybe retrying mobile if Google doesn't do that well

        - probably even more anti-competitive email nonsense, in the sense of "who needs email providers besides a few giants (like Gmail)"

        - probably trying to make proper (by then) modern 2FA not work on Linux due to patent or DRM issues, trying to make it hard to log in anywhere on the web with a native Linux system

        • trympet a year ago

          > lock down of PC software to the Microsoft app store, probably roughly at the same time apple also tightens a lock down on macOS

          Nothing is locked down. UWP and Win32 are functionally equivalent with sparse registration. If you decide to publish to the store anyway, you can even use your own payment provider and keep 100% of the revenue.

          > more anti competitive steps to make installing desktop Linux on consumer hardware hard while arguing you could just use WSL and it's "for your security" and they are not anti Linux because they embrace Linux on servers

          Isn't WSL a win-win for the Linux-community and Windows-users alike? WSL2 and WSLg make Linux desktop apps more accessible than ever.

          > trying to kill Valve, there should only be Windows games and only through the Windows app store

          Xbox Game Studios games are released on Steam.

          I could go on, but you get the point.

          • dathinab a year ago

            it's not about what is done now, it's a response to the parent's comment of "what will happen if Microsoft becomes much more powerful and market dominant again"

            i.e. it's all hypothetical speculations about the future under a given assumption of development which might never happen

            > Linux desktop apps more accessible than ever.

            yes, but it also removed the need to use Linux and could be extended with features Linux doesn't have => i.e. it could be used for the good old embrace, extend, extinguish strategy Microsoft loved to use in the past

            • CoolCold a year ago

              > yes but it also removed the need to use Linux

              And is it a problem for user? Or anyhow a bad thing? Why not advocate for FreeBSD for example?

        • DeathArrow a year ago

          Do you honestly think that Microsoft is scared of Linux taking over the desktop?

          • dathinab a year ago

            It never was about emotional things like being scared.

            It's about eliminating competition and especially preventing potential future competition; it's about maximizing control and power as much as you can without losing otherwise (due to e.g. regulation, lost consumer trust, etc.). It's just a game of numbers and future prospects.

            Just the fact that Valve has Linux as a form of "escape hatch" if Microsoft locks things down more, with a bit of "future potential" (a Valve gaming console which is more than "just" the Steam Deck), is enough for Microsoft to take actions like that from a purely calculative perspective.

            And the Linux desktop _has_ future prospects, maybe not in the form most Linux desktops currently take, but prospects anyway.

            I mean Linux desktop has a lot of additional challenges:

            - like _massive_ fragmentation through every layer of components and users, often with a lot of opinions that are more emotional than technical

            - much smaller financial resources etc. (the companies which invest are either small (e.g. System76) or are not focused on desktop Linux (e.g. Canonical, Red Hat, Valve))

            - a lot of money is flowing into server Linux, hence decisions tend to be focused on the server side, sometimes in ways that are negative for desktop Linux and cannot be configured away (though there are also a lot of "accidental" improvements and maintenance)

            but even with all these challenges Desktop Linux is quite usable, actually for some people _more_ usable than Windows. The main problem is normally not missing functionality or hardware support but fragmentation. Fragmentation making it a bad deal to support Linux as a software vendor (getting better through Valve, Flatpak and Snap), fragmentation wasting dev resources, fragmentation making system management/hardware support harder, etc. As well as there not currently being much monetary reason to invest in 1st party desktop Linux support (Valve is a special case, System76 is small).

            But none of the reasons which make people not use Desktop Linux are fundamental, and under the right conditions _one specific_ Linux Desktop could become highly successful. It's very unlikely but it's still possible, hence there is reason to make sure it's not possible.

            • DeathArrow a year ago

              Microsoft already owns the desktop and I don't think they consider Linux a threat.

              macOS would be a better contender but I don't think they consider that a threat, also.

              In fact, Windows is starting to weigh less and less in overall Microsoft strategy. Windows mattered when all we had were desktops but now they have lots of cows they are milking.

              I won't be surprised if in the future Microsoft finds that it isn't worth paying for the development of Windows and releases its own Linux distribution. It would be quite a disappointment for me, but it's certainly possible.

              > but even with all this challenges Desktop Linux is quite usable, actually for some people _more_ usable then Windows.

              I've tried for 23 years to use Linux as a desktop, but it failed repeatedly. I even have it installed on a separate disk on my home PC, but I seldom boot it.

              • dathinab a year ago

                > they consider that a threat, also.

                In the hypothetical scenario from above it doesn't need to be a threat, just having the potential to become one can be enough. I mean, why shouldn't you remove potential future threats when you easily can do so now but can't once they become a threat?

                > I've tried for 23 years to use Linux as a desktop, but it failed repeatedly. I even have it installed on a separate disk on my home PC, but I seldom boot it.

                and others do not fail. I know even some quite non-technical people who use it daily, and there are dozens of Steam Deck users, too

                • DeathArrow a year ago

                  I am glad Linux desktop didn't fail for others. Maybe they enjoy fixing broken stuff, configuring things repeatedly and tinkering with it. And maybe they don't miss software that doesn't run on Linux.

                  I am busy and I want to use software, not tinker with the OS. The OS should just work. And since software I use both professionally and personally doesn't run on Linux, it wouldn't be good for me even if the experience was smooth.

                  On the server side I use Linux a lot. Most stuff I develop runs on Linux.

    • computerex a year ago

      No one is talking about how they quietly acquired Nuance. If it weren't for Google, Microsoft would have a complete monopoly on AI.

  • ilovetux a year ago

    > have to wonder at what point developers remember how anti-developer Microsoft used to be

    Microsoft has historically been anti-opensource, but not anti-developer. Their first product was a BASIC interpreter and in my experience throughout the 90s and early 2000s their developer ecosystem (aka Visual Studio) has really been first-class.

    I am not a fan of Microsoft because they have been openly hostile to open source, but I don't think it's fair to say they have been anti-developer.

    • erikstarck a year ago

      "Developers! Developers! Developers!"

      He wasn't kidding.

      • oefrha a year ago

        Brings back the fond memories of Ballmer and his sweaty shirt.

    • cornholio a year ago

      There is some nuance here. Microsoft executed "commoditize your complement" to perfection - the complements to their OS being PC hardware and applications.

      This meant they had to be insanely good at supporting a vast array of diverse hardware, but also offer exceptionally good support for developers to keep the barriers to entry low in the Windows software market. They even had a cute name for these commoditized and neutered competitors - "ISVs". Basically, Microsoft owned the OS and the major applications like office & enterprise software, media, browser etc., and everything else was supplied by an ISV - for example, your accounting software for country XYZ, a market where MS had no interest in entering.

      As long as you kept within the ISV playground, MS was developer friendly, but it would turn very hostile to any perceived competitor to their core assets. Undocumented APIs, monopoly abuse, dark patterns, the entire circus. This strategy made the PC market impenetrable for nearly two decades, and it was only through sheer luck and complacency that the mobile revolution caught them on the wrong foot.

      • DeathArrow a year ago

        > it was only through sheer luck and complacency that the mobile revolution caught them on the wrong foot.

        Windows Phone wasn't bad and I wish we could have more competition in mobile space.

        • cornholio a year ago

          I wish we could have open platforms that competitive players can extend and develop without owning outright and excluding other competitors.

          It's the great next step in regulating monopolies, contemporary products no longer exist standalone in the marketplace but must always interoperate with existing infrastructure and platforms. The last decades of tech competition were a repetition of this basic tune, some first mover more or less stumbling into a de-facto standard and then fighting like hell to maintain its dominance and undeserved rent extraction.

          The situation is complicated by the nature of international trade vs local regulations, it might not be good for your national consumers to be fleeced by a platform monopoly, but you more than make up for it if your national tech champions achieve world-dominance.

          • DeathArrow a year ago

            > The last decades of tech competition were a repetition of this basic tune, some first mover more or less stumbling into a de-facto standard and then fighting like hell to maintain its dominance and undeserved rent extraction.

            Isn't that true for every industry?

          • DeathArrow a year ago

            > I wish we could have open platforms that competitive players can extend and develop without owning outright and excluding other competitors.

            Who will pay for the development of those platforms? Who will ensure the compatibility with hardware?

            • cornholio a year ago

              The development costs for Windows (as a common platform, stripped of any add-on that could be provided by other competitive players) are a footnote in the costs of the global software market for Windows. There is indeed a complex problem to be solved of how to organize an open platform so that the development costs are paid, but it's not fundamentally a question of resources. Multiple companies and open source teams have achieved roughly similar feats with many orders of magnitude less resources than what MS rent-extracted from the Windows OS consumers.

              Ditto for hardware, you need a well defined and stable interface and the vendors will adhere to it once there is critical mass.

              > Isn't that true for every industry?

              In almost every such historic example that still exists in some fashion today (railways, energy, telco, utilities etc.), there was strong regulatory action to break up monopolies accompanied by rigorous standardization of the common interfaces.

              I don't have a problem with competitive players inventing industries and new ways of doing things for the allure of monopoly profit. Just with the sluggish regulatory action in the particular case of computing tech, well past the point where it has become an established and essential resource for society.

              • DeathArrow a year ago

                > The development costs for Windows (as a common platform, stripped of any add-on that could be provided by other competitive players) are a footnote in the costs of the global software market for Windows. There is indeed a complex problem to be solved of how to organize an open platform so that the development costs are paid, but it's not fundamentally a question of resources. Multiple companies and open source teams have achieved roughly similar feats with many orders of magnitude less resources than what MS rent-extracted from the Windows OS consumers.

                I was talking about Windows Phone, a mobile operating system. To have an open platform, you have to have one big company paying and managing its development. Otherwise you end up with fragmentation, lack of ABI stability, backwards incompatibilities and "distros" like with Linux on the desktop, and that is not exactly a consumer success.

                If you want an open platform, you have Android. But apart from Huawei (which was forced by the US government) and maybe a few small manufacturers from China, I don't see much competition in the space.

                Contrast this with a parallel world in which we have Android, iOS, Windows Phone, WebOS, Bada, Tizen, Maemo, BlackBerry etc.

    • ensignavenger a year ago

      At this point, it seems MS has almost as many years of being neutral-supportive of FOSS as they have of being against it.

    • bastardoperator a year ago

      Is there anyone doing more for open source today? I hear you, but it's clear they've changed that tune dramatically.

      • throwaway290 a year ago

        By monetising OSS and sidestepping copyleft licenses to suggest the same code back to you, repackaged as part of a proprietary autocomplete for which you pay MS and not the original authors?

        • bastardoperator a year ago

          [flagged]

          • throwaway290 a year ago

            Sounds like you could get a refresher on how it works, and regarding your ad hominem you are clearly using your real name just as much.

    • lancesells a year ago

      Many people paid to host their code at Github to only then have it sold back to them and others in the form of Copilot.

      IMO that's very anti-developer.

      • vincentkriek a year ago

        That is not anti-developer, but definitely anti-opensource. As a developer I don't really care that my code is used to make Copilot better and sold in that way.

        • bhj a year ago

          You should care if your code's license is being violated by being reproduced without attribution. Undermining OSS's licenses can ultimately weaken it, and the cynic in me suspects MS is fully aware of this.

    • mistrial9 a year ago

      the economic contracts offered to developers by Microsoft were very different from those in other development ecosystems. Microsoft always represented a different economic culture than many others; developers chose their allies based on multiple criteria.

    • PurpleRamen a year ago

      Microsoft was also for a long time anti-command-line and against scripting; they tried to make everything clickable, with wizards and s**, and to remove text and keyboard input as much as possible for admins and devs.

      Thinking about it, Copilot is in its own way a continuation of this, just more dev-friendly.

      • bradford a year ago

        > Microsoft was also for a long time anti-command-line and against scripting; they tried to make everything clickable, with wizards and s*,

        I'd categorize this as incompetence, not malice towards developers.

        Specifically: Microsoft thought that code-creation wizards and UI would offer a better story than command line and text. Those attempts were misguided, and MS adjusted. The command line culture at MS has been pervasive for a long time, despite the quirkiness of DOS, so I must object to your categorization of MS being anti command-line and anti-scripting.

        [disclaimer, MS employee, my opinion only.]

      • klibertp a year ago

        Yeah, but then they made PowerShell, which at the time was seriously the most advanced shell and shell scripting language out there. PS got pretty mixed reactions because it was different from Bash, but the idea of typed pipes of objects instead of one-size-fits-all streams of lines of text was (and still is) powerful. (The syntax could be a little less verbose, though.)
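        The typed-pipeline idea is easy to illustrate outside PowerShell; here is a rough Python analogy (not PowerShell syntax) contrasting it with the text-stream style:

```python
# Rough Python analogy for PowerShell's typed pipeline: each stage receives
# structured objects with named fields, not lines of text to re-parse.
from dataclasses import dataclass

@dataclass
class Process:
    name: str
    memory_mb: int

procs = [Process("chrome", 900), Process("vim", 20), Process("slack", 450)]

# Text-stream style (Unix-ish): format to text, then parse it back with
# split() and hope the columns line up.
lines = [f"{p.name} {p.memory_mb}" for p in procs]
heavy_text = [l.split()[0] for l in lines if int(l.split()[1]) > 400]

# Object-pipeline style (PowerShell-ish): filter and project on typed fields,
# with no serialization round-trip in the middle.
heavy_objs = [p.name for p in procs if p.memory_mb > 400]

assert heavy_text == heavy_objs == ["chrome", "slack"]
```

        Both arrive at the same answer, but the object style can't silently break when a field happens to contain a space or the column order changes.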

      • DeathArrow a year ago

        Being a developer isn't the same as being a command line user.

        Most Windows users prefer GUI over the command line.

        That being said, Microsoft released PowerShell, Windows Terminal and lots of command line tools. A large part of Windows administration can be done through the command line if one so desires.

  • brundolf a year ago

    Public companies aren't people. It doesn't make sense to trust them, but it also doesn't make sense to hold a grudge against them. They act in a way that fits the moment. Right now, Microsoft seems to be in a "build good tools" moment. If they get too dominant they could re-enter an "abuse power" moment. But that would be because it's what they think is in their best interest, not because "Microsoft is [uniquely] untrustworthy"

    • chatmasta a year ago

      That may be true, but for many of Microsoft's flagship open source projects, it is very much about the people. For example, TypeScript has been created and maintained by a relatively stable team of highly proficient developers who have become the face of the project and in many ways the driving force behind its success.

      It's important that Microsoft backed them, because it provides a stable environment for the developers to keep working on the project and gives confidence to the community that it's a stable language to adopt. But fundamentally it's the people who are important.

      In recent years, Microsoft has shown there can be real upside to corporate-driven open source, especially when it has proper buy-in from management and usage throughout the organization (e.g. VSCode is developed in relative lockstep with TypeScript, and both projects benefit from that relationship).

    • cmrdporcupine a year ago

      Yep. But also this kind of essentialist thinking doesn't work for people, either, so :-) Nobody is just one thing or another. And neither are companies. They act on various levers, in response to various interests, and it's about looking at the tendencies and patterns. Holding grudges against people makes little sense, either.

      • brundolf a year ago

        Fair, although even on a sliding-scale I think trust mechanisms are less-inaccurate when it comes to a single person (who can change, but usually in gradual/limited ways) vs a company (which is a revolving door of thousands of people who come and go over the decades)

  • _the_inflator a year ago

    MS was smart enough to use the existing brands to somewhat whitewash its previously poor reputation in the developer community.

    I guess some folks at Google will regret not buying GitHub. On the other hand, Google is, in my opinion, not the best choice when it comes to product development and integration. Their business model centers on ads, while Microsoft has the better position here with a subscription model.

  • tarruda a year ago

    When you use Copilot integrated in the editor, does Microsoft collect all the source code in your project, or only the context used to perform the completions?

    • SparkyMcUnicorn a year ago

      When you sign up for CoPilot, there's a settings section on Github for it. One option you can toggle is "Allow GitHub to use my code snippets for product improvements *".

      Context still needs to be processed, so the surrounding line, block, and a couple of open tabs get piped into the prompt.

      And here's a quote from the privacy page.

      > Depending on your preferred telemetry settings, GitHub Copilot may also collect and retain the following, collectively referred to as “code snippets”: source code that you are editing, related files and other files open in the same IDE or editor, URLs of repositories and files path.

      • belorn a year ago

        What is the default?

        • skripp a year ago

          To share. But it's a very obvious check-box.

    • jghn a year ago

      Concern over this is the #1 reason I have not yet tried to use Copilot. For my hobby projects I don't care enough to pay for it. And if it's phoning home proprietary code, I can't allow that to happen.

      • vintermann a year ago

        Eh, as long as my employers don't care (they don't), I don't care. I have no illusions that my code/our code will give Microsoft any valuable training data it couldn't trivially get elsewhere.

        • jghn a year ago

          Mine does, and therein lies my issue.

          • bongobingo1 a year ago

            If that's the case, would Copilot be useful anyway? Or are you off the range where suggestions won't help?

            • jghn a year ago

              In theory there are no rules about importing code, beyond the usual licensing issues. But people use SO and such all of the time, right? If one *really* wanted to do a global audit of improperly imported code, we'd all have bigger problems. So from that perspective it's status quo.

              But I don't want to be the person caught uploading proprietary code to another company's servers.

              It's not a major issue, and I doubt it'd ever be a practical problem. But fear of punishment keeps me away.

      • programmarchy a year ago

        It’s worth it even for hobby projects, imo. It reduces the time spent on mundane tasks and allows you to think at a higher level and just move faster. Maybe you achieve a level of zen from implementing utility level code, similar to how some people might still write assembly code, but otherwise it’s a valuable tool/skill to learn.

        Tangentially, I think there’s some fear associated with adopting AI tools, perhaps because developers feel like their skill sets are being displaced. And they are, but there’s headroom: e.g. assembly programmers learned C. There seem to be some post-hoc rationalizations being put forth to avoid that fear, but my sense is that developers who don’t cultivate this new skill set will fall behind.

        • Vespasian a year ago

          I'm being reminded of a close friend of mine who is a car mechanic. In recent years the fraction of BEV and PHEV among new cars has risen to ~20% which absolutely will influence his job and will require new skills of a different kind.

          Yet, despite the obvious evidence, he is unwilling to even acknowledge the possibility that this is happening and refuses to research what it could mean for him (which may be very little).

          I never quite understood why. Certainly just keeping in touch with the world wouldn't hurt right?

          With the rise of AI, I think I get it. There's a part of me that is scared shitless at the prospect of being made redundant in the near future, with all my acquired skill being worthless in this new world. The temptation to put my head in the sand and hope it blows over is strong.

          I've resigned myself to never become like my friend and consequently have recently shelled out for a year of Copilot. My thinking is that at worst it's 100€ wasted and at best I'm not blindsided by what is coming anyway.

          The reality will probably fall somewhere on a middle ground where there are still jobs to be found.

        • discreteevent a year ago

          > my sense is that developers who don’t cultivate this new skill set will fall behind

          That might be true but it's an easy skillset to pick up compared to programming. The bigger danger is that new developers will lean on AI so much that they do not pick up the fundamentals of programming in which case they will definitely be left behind.

          • aliasxneo a year ago

            Many, probably. However, the curious types will likely be further enhanced by AI. I've never been one to take code at face value, and I have been enjoying sessions with ChatGPT asking all sorts of questions about some of the stuff it produces. The answer is usually sufficient, and in cases where it's not, I've been given enough background context to know where to find the answer online or in books.

            Honestly, I've seen myself master many more additional things since I've started including it in my daily routine.

          • blibble a year ago

            the result of this will be similar to hiring infosys

            hundreds of thousands of lines of buggy incomprehensible boilerplate that doesn't work on anything but the easy cases

            then you have to rip the entire thing apart and start again with people that know what they're doing

        • unshavedyak a year ago

          Can you describe how you use it? I struggle to imagine how it would even be done. Ie do you write prompts? Just code as normal but frequently hit a "copilot" button? etc

          Though i do wonder if it'll improve my ability to read code. PRs are a pain because i find it easier to write than read. I'd pay for Copilot in a heartbeat if it was good at spotting PR errors/etc.

          • nicky0 a year ago

            Just type your code in the editor. And it offers auto complete suggestions. Sometimes it will complete the entire function based on the function name or a comment. Sometimes it'll just guess the function you want to write, without you typing anything at all. (Turns out a lot of code is rather predictable).
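
            To give a flavor of the comment-driven completion described above (an illustrative sketch of my own, not actual Copilot output): you type the comment and the signature, and it proposes a plausible body.

            ```javascript
            // You type only the comment and the first line; the body below is the
            // kind of completion Copilot typically proposes (illustrative only).

            // check whether a string is a palindrome, ignoring case
            function isPalindrome(str) {
              const s = str.toLowerCase();
              return s === s.split("").reverse().join("");
            }

            console.log(isPalindrome("Racecar")); // true
            console.log(isPalindrome("hello"));   // false
            ```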

            From experience, though, it’s best to go line by line rather than accepting whole-function autocompletes.

            For me, it’s been incredibly useful for generating test cases. It will type out test functions for various conditions, stuff that is normally really tedious to code.

            Sometimes it’s eerie how well it knows exactly what the next line should be. Countless times it has filled in an important detail that I hadn’t thought of.

            It’s not perfect at all, sometimes it goes off on tangents or writes incorrect code.

            I don’t think you even have to pay for copilot. At least it’s free for me.

            • Vespasian a year ago

              They have a limited trial or company memberships afaik.

              It costs $10/month or $100/year for individual users.

              • nicky0 a year ago

                ~~That's weird because I don't pay anything.~~

                EDIT: GitHub Copilot is free to use for verified students, teachers, and maintainers of popular open source projects

                • toastal a year ago

                  The Adobe model: let students and schools train on it, then demand employers buy the subscription when they graduate.

      • madisp a year ago

        I use GitHub so not really a concern for me, they have my code already.

        • mplanchard a year ago

          IIRC they didn’t train on private repos though, so using copilot in a private (github) repo will potentially open up your proprietary code to being used in that way.

          • insane_dreamer a year ago

            No, the model doesn't train on your private code (which is good but also somewhat limiting as in my experience it doesn't provide useful answers that are very specific to your codebase); it's good for generic code though and saves time looking stuff up.

    • meesles a year ago

      According to my subscribing and testing it out with the Sublime extension, you get to decide whether your code gets piped up into their model.

      Not that I've verified it by monitoring network calls.

    • wseqyrku a year ago

      The same is true if you use `git push`, in which case all the code is transferred over the wire and collected by GitHub, which may or may not be desirable.

      • airstrike a year ago

        git != GitHub

        • folkrav a year ago

          I think OP's point was that GitHub=Microsoft, so you're effectively sending your code to Microsoft in one way or another. Although the licensing/privacy policies are probably different for private repositories.

  • piokoch a year ago

    Who will verify that the data are correct? What if the majority of the code sucks? What if most code uses "best practices" that are no longer valid? Think of the many GoF Java "patterns" that are now considered a code smell.

    AI will not invent anything, it will effectively reproduce mistakes made by others.

    Writing code is such a small part of the whole IT project that shortening the time spent on it does not matter much in practice.

    • nixarn a year ago

      Have you tried co-pilot? I don't want to code without it. It saves so much time and produces good results, compared to searching for answers online, which isn't easy: you end up on ad-filled sites, shitty Stack Overflow answers, and webpages with outdated docs and examples.

      Two examples from this week. Formatting dates in JavaScript: I had a datetime string and I wanted to show it as YY-MM-DD HH:MM for our internal tool. I don't know by heart exactly what to do in this case, although it's far from rocket science, so now I could write a comment saying what I wanted done and copilot coded it for me.
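
      The generated code for that kind of comment typically looks something like this (my own reconstruction, not the commenter's actual output; the function name is made up):

      ```javascript
      // format a datetime string as YY-MM-DD HH:MM
      function formatShort(datetimeString) {
        const d = new Date(datetimeString);
        const pad = (n) => String(n).padStart(2, "0");
        return `${pad(d.getFullYear() % 100)}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}` +
               ` ${pad(d.getHours())}:${pad(d.getMinutes())}`;
      }

      console.log(formatShort("2023-03-22T14:05:00")); // "23-03-22 14:05"
      ```

      Note that `Date` parses an ISO string without a timezone suffix as local time, and the getters above are also local, so the two stay consistent.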

      Same when I wanted a request to become a file download for the user, not something I've done many times before, and I could kinda reason that it probably needs a header to be set to something. Googling for this didn't give good results; you need the right language, framework, etc. With copilot I just wrote the comment // return file as download, and co-pilot wrote the code to set the header and send the bytes. Amazing!
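
      The header in question is Content-Disposition. Here is a minimal sketch of the pattern in Node's built-in http module (my own illustration; the commenter's language and framework aren't stated):

      ```javascript
      const http = require("http");

      const server = http.createServer((req, res) => {
        const bytes = Buffer.from("id,name\n1,Ada\n", "utf8");
        res.writeHead(200, {
          "Content-Type": "text/csv",
          // "attachment" tells the browser to download instead of render
          "Content-Disposition": 'attachment; filename="export.csv"',
          "Content-Length": bytes.length,
        });
        res.end(bytes);
      });

      server.listen(0); // any free port
      ```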

      • chatmasta a year ago

        > Have you tried co-pilot?

        I wanted to try co-pilot, but noped out when I saw it required a subscription. I thought it was in some kind of beta and would still be free. Is there a way to try it without signing up for a subscription?

        My general worry is about becoming personally dependent on a paid tool just to do basic programming work.

        • pantulis a year ago

          Your worry is going to become obsolete pronto, just like you are not worried about depending on a paid CPU to perform basic computations. The meaning of "basic programming work" will be redefined by these tools.

          The entry barrier of the subscription is a shame, that's for sure. But until open models are available, the field is proprietary today: we are going to witness a battle of AIs that will be as bloody as the Unix Wars of lore.

          • chatmasta a year ago

            I suspect you're right, and I'm generally optimistic about this future efficiency. But it doesn't make me any less of a cheapskate :)

            • pantulis a year ago

              Totally understandable, Chat GPT Plus is the most expensive subscription I ever had.

        • yamtaddle a year ago

          I still want to know if MS lets, say, Windows or MS Office developers use it. If not, they must consider it too risky from a copyright standpoint, which means so do I.

        • DeathArrow a year ago

          > . Is there a way to try it without signing up for a subscription?

          Yes, you can sign up on the wait list and get a 2-month free trial.

      • taormina a year ago

        So it set some headers. Did it pick the right headers? Did it know what headers the recipient was expecting?

  • mike_hearn a year ago

    I hope that Copilot X will be brought to other IDEs as well, not just Visual Studio [Code].

    • efields a year ago

      The sublime plugin for Copilot works well enough already. I'm sure Copilot X will be something you can fold into any editor. I don't understand how so many folks seem to tolerate the UX lag in VSCode

      • joshmanders a year ago

        > I don't understand how so many folks seem to tolerate the UX lag in VSCode

        A lot of us don't experience any issues. To me, VSCode is just as performant as Sublime is.

        • eitland a year ago

          I guess what you experience is how I experience 1 px misalignment and similar things that some people always complain about in KDE:

          I don't notice it at all.

          Modern software that doesn't react immediately even when running on even more modern hardware however, that grinds my gears.

      • epolanski a year ago

        Maybe they have a better machine than you so they don't notice it.

        • jlkuester7 a year ago

          My dev machine is sufficiently beefy (32 GB of RAM). I recently tried https://lapce.dev/ and was very surprised that it was noticeably snappier than VSC. (I am not super sensitive to that kind of thing after years spent in Eclipse/IDEA.)

          Made me realize how I can just become accustomed to a certain amount of lagging....

          • pzo a year ago

            I recently tried Lapce on my Mac, and an empty editor with just one new tab was eating 70% of my CPU. I'd rather an IDE eat 1-2 GB of RAM than my CPU cycles. Also, on their GitHub, Lapce has more than 50% of its open issues labeled C-Bug; that's not very reassuring.

      • crucialfelix a year ago

        Try disabling all extensions.

        I know there are startup metrics, and I would expect there are keystroke metrics to understand what's running

      • acedTrex a year ago

        VSCode never has any lag for me personally

    • highwaylights a year ago

      They've already committed to bringing it to other platforms where Copilot is currently (JetBrains IDE's for example).

    • mwilsonthomas a year ago

      Hey Mike: see this post for info about Copilot chat in Visual Studio 2022!

      https://aka.ms/GHCopilotchatVS

      Hope that helps

      Cheers Mark Wilson-Thomas Program Manager, Copilot chat for Visual Studio

      • mike_hearn a year ago

        Hi Mark,

        Thanks for the reply! Actually I was thinking of JetBrains IDEs, but I totally get why you're creating features for your own IDE line. If JetBrains want a similar feature, perhaps they'll have to build it themselves. Hopefully that actually is possible and OpenAI's stuff isn't now locked behind an exclusivity deal.

    • norman784 a year ago

      Seems that you didn't bother to check their site before commenting. Check it out here[0], but TL;DR: they have VS, VS Code, JetBrains and Neovim logos there.

      [0] https://github.com/features/preview/copilot-x

      • Version467 a year ago

        The page you linked only shows those logos in a block that specifically talks about Copilot, not Copilot X, so I don't see how this implies that Copilot X will come to those platforms as well.

        (I expect implementations for those platforms to show up eventually, I just don't think the logos on that page are evidence for that.)

      • celeritascelery a year ago

        I don’t know about JetBrains, but the Neovim plug-in is not full-featured like the VS Code one. For example, it doesn’t have the “explain this code” feature. The repo also does not allow issues or PRs, so you can’t even ask for feature parity. I am a paying GitHub Copilot user with the Neovim plugin (in Emacs), but I find this second-class treatment frustrating. I expect Copilot X to be more of the same, since it is even more deeply integrated into VS Code.

      • mike_hearn a year ago

        The blog post we're talking about says specifically:

        "We are bringing a chat interface to the editor that’s focused on developer scenarios and natively integrates with VS Code and Visual Studio."

    • rvz a year ago

      Here we go again with the editor lock in as precisely predicted. [0]

      [0] https://news.ycombinator.com/item?id=27685104

      • mynameisvlad a year ago

        I like how you probably spent more time looking up your multiple-year-old comment that could have instead been used to make sure that you’re not sounding like a fool that doesn’t know what he’s talking about.

        They have already committed to releasing it to all editors Copilot supports. So, you know, the exact opposite of editor lock in.

        • rvz a year ago

          > I like how you probably spent more time looking up your multiple-year-old comment that could have instead been used to make sure that you’re not sounding like a fool that doesn’t know what he’s talking about.

          It took seconds and it is still true and evergreen to this day. Thanks for your so called 'concern'.

          > They have already committed to releasing it to all editors Copilot supports. So, you know, the exact opposite of editor lock in.

          They are more 'Committed' to supporting VS Code than giving total feature parity to other editors. Do you really believe everything that Microsoft / OpenAI feeds you?

          • mynameisvlad a year ago

            Considering they have several third party editor logos in full display on their main landing page, I'm going to believe Microsoft and OpenAI over some random naysayer on the internet who didn't even bother to check.

            https://github.com/features/preview/copilot-x

            What benefit would they even gain from locking it to their free editor? If anything, VS Code is a way for Microsoft to push other services like Copilot. The strategy has been and continues to be to bring these services to where the developer is.

            • rvz a year ago

              So there is 100% feature parity then for all other editors other than VS Code then? YES or NO?

              As I said before, "They are more 'Committed' to supporting VS Code than giving total feature parity to other editors."

              Sticking a bunch of logos with no guarantee of 100% feature parity as seen in VS Code is exactly what lock-in is.

              > If anything, VS Code is a way for Microsoft to push other services like Copilot. The strategy has been and continues to be to bring these services to where the developer is.

              Re-centralizing everything and owning the entire developer ecosystem to MS / GitHub. What could possibly go wrong? /s

              • mynameisvlad a year ago

                > So there is 100% feature parity then for all other editors other than VS Code then? YES or NO?

                Yes. If you had taken the time it took to look up your comment to actually do something productive like looking this up, you'd have probably found it yourself. Yet, here we are.

                Copilot fully supports JetBrains and NeoVim alongside Code and VS:

                https://docs.github.com/en/copilot/getting-started-with-gith...

                https://docs.github.com/en/copilot/getting-started-with-gith...

                In fact JetBrains is listed before VS proper in the getting started guides:

                https://docs.github.com/en/copilot

                Sounds totally like a place that "is more committed to supporting VS Code than giving total feature parity to other editors".

                > Re-centralizing everything and owning the entire developer ecosystem to MS / GitHub. What could possibly go wrong? /s

                How, exactly, is providing Jetbrains and Neovim support "re-centralizing everything"?

                Next time, do the most bare minimum of research before you double then triple down on an absurd argument not based in reality.

                • klibertp a year ago

                  > How, exactly, is providing Jetbrains and Neovim support "re-centralizing everything"?

                  Bait and switch. If it's good enough that Neovim users can't live without it, pulling the plug on Neovim support will convert some subset of users to VS Code. It probably won't play out this way with JetBrains, but editors with smaller followings and nobody backing them will most likely suffer this fate. It happens all the time, most notably with Google products. Google Talk used XMPP and was neat; I switched to it because I could use Pidgin to reach most of my contacts. Not only did Google Talk stop supporting the standard, it even died and was reborn as something else, I think 3 or 4 times by now. Of course, my contacts stayed with Google, so I had to leave Pidgin behind. It's going to be similar here, though to what extent I'm not sure; maybe it won't be very noticeable, or maybe it will. We'll see.

                  • mynameisvlad a year ago

                    That’s all fair, but that’s not even remotely what their argument was.

                    Their entire point is that Microsoft is re-centralizing everything by forcing people onto VS Code. Which is something they’re… just not doing.

                    This is also an optional, paid tool to help when coding. The comparison to Google Talk is IMO not relevant. It’s never going to be “good enough that someone won’t be able to live without it” because it’s at its core a completely optional tool.

                    If Copilot for NeoVim goes away in 5 years, you can just… stop using it. It’s not like we haven’t developed things without Copilot for decades now.

  • bob1029 a year ago

    > have to wonder at what point developers remember how anti-developer Microsoft used to be and potentially move away from their ecosystem.

    At a certain point you may find that you've got your hand so far in the Microsoft cookie jar that it would make more sense to just accept that you are a MS shop and go for the rest of the cookies as shamelessly as possible.

    We've embraced our fate. Almost everything is Microsoft branded in our workplace now.

    It's really easy to hate on Microsoft through the lens of programming tooling and other nerd abstractions. It's super hard to hate on them from the perspective of IT administrators and business owners wanting to enjoy their weekends. [Azure] Active Directory and the modern Windows/Office suite are a competitive advantage for enjoying your free time. I have never seen a better overall experience for managing a small startup.

  • tootie a year ago

    I'm a fanboy of heavy IDEs with big fat debuggers like JetBrains tools. It's ironic that Microsoft who used to dominate that niche, is now dominating with a glorified text editor.

    • DeathArrow a year ago

      There are still lots of people using Visual Studio.

  • eterps a year ago

    It sure looks like they're setting themselves up to (again) be in a position where no one can get around them.

    • popcorncowboy a year ago

      The irony of this comment is absolutely wonderful.

  • bredren a year ago

    The waiting list signup does not allow you to pick Pycharm, which previously has had a GitHub copilot plug-in.

  • btbuildem a year ago

    > Credit to Microsoft's PR team for somehow managing to turn around public opinion about them, it's an all timer

    To be fair, it's been almost two decades. I feel like since Ballmer's departure, MS began to turn things around.

  • knodi123 a year ago

    Half our team is really entrenched in JetBrains. But aside from a couple of curmudgeons who are formally married to vim, I think VSCode and JetBrains have pretty much sewn up the market.

    • mempko a year ago

      Listen, if I divorce vim, it will take half my stuff.

      • airstrike a year ago

        How about using vim mode in VS Code or is that like being in an open relationship?

        • notpachet a year ago

          It's more like marrying a sex doll.

  • soiler a year ago

    > have to wonder at what point developers remember how anti-developer Microsoft used to be and potentially move away from their ecosystem

    I mean, any company can become hostile to a large portion of its userbase. Most are. Microsoft already is with Windows OS being spyware. Are you saying that you think all of this is a trap to bring developers in to VSCode etc. and then transform it into a terrible experience? People will leave then. SWEs are not generally an audience that is unwilling to replace bad tools.

  • PaulWaldman a year ago

    Does anyone have a prediction for how this translates into Microsoft's bottom line? I'd imagine it is mostly increased Azure sales with a few Visual Studio licenses.

    • ren_engineer a year ago

      I could see them selling an absurdly expensive enterprise on-prem copilot that is tuned for a company's codebase and able to be customized to some extent. If they can show how much it improves productivity it would be an easy sell. Plus tightly integrate everything with Azure like you said

      Bill Gates was also talking about company AI's that "attend every meeting" and are involved even in non-technical areas via Office. Microsoft seems all in on this

      • Vespasian a year ago

        They have some great products in there (probably) but most likely they expect some of their attempts to fail (it's to be expected).

        It's a "bet the retirement fund but not the farm" situation where they invest a lot of money to see what sticks.

        They are in a great position to do that, and burning a few million in the process might be worth it.

    • toyg a year ago

      Azure/subscription sales is what they care about anyway. Everything else is a commodity.

    • epolanski a year ago

      Bingo, exactly what I'm thinking.

      Sure, they get more developers to use their software, but to how much money does that translate considering the tens of billions they have spent?

      Not sure how that helps Azure, by the way. It helps GH more than anything.

  • ThorsBane a year ago

    I think enterprise offerings will be more feature rich and advanced for some time, but eventually open source alternatives will catch up especially with how Nvidia is doing amazing work to reduce the compute costs for training and deploying these models over time. I consider it similar to how a few major cloud providers are more feature rich and can scale more effectively than smaller ones, but the smaller options still exist and get better all the time.

  • DeathArrow a year ago

    > Microsoft really seems to have wrapped up the developer ecosystem with VS Code and the Github acquisition combined with OpenAI. They are going to have an absurd amount of data to optimize their models thanks to that, not sure how other AI focused companies can overcome that

    Other companies can focus on other areas where AI can be used. There's room for everybody.

  • nonethewiser a year ago

    Microsoft is far more open than it used to be. It’s not simply marketing.

  • ModernMech a year ago

    Funniest thing for me is how MS has been left out of the FAANG acronym all these years, as if they somehow haven’t been relevant in the developer ecosystem for the past 10 or so years.

  • kmlx a year ago

    this just tells me it’s only a matter of time before they will be forced to split.

    windows, office, xbox, devices, github, azure, zenimax, activision blizzard and so many others i’m forgetting.

  • boppo1 a year ago

    If I'm using VS Code, am I opting in to sharing my code as training data?

  • moneywoes a year ago

    Reminds me of the IE days

  • rvz a year ago

    [flagged]

    • pongo1231 a year ago

      Aren't lock-ins into their own respective ecosystem what every tech giant is striving for? Don't see how that behavior is Microsoft-exclusive necessarily.

stuckinhell a year ago

I have some interesting stories from writing a report about a couple teams using AI pair programming for development at my firm.

I'm seeing experienced senior developers use the AI to context switch like lightning. They know when the AI is bullshitting them, but they can use that seed to "jumpstart" their memories.

Junior developers doing very domain-specific tasks are taking longer to develop using the AI. However, boilerplate work is sped up significantly.

mikkelam a year ago

I have absolutely loved Copilot so far. I especially love how fast it handles indexing complex n-dimensional arrays, which I'm really bad at. I also feel like it often pushes me to more formal solutions instead of hacking. I'd estimate a 10% velocity increase for me with it.

Super excited for GPT-4 behind it.

  • knodi123 a year ago

    My team was freaking out about copilot because we're conducting tech interviews right now, with a take home exam and some pairing sessions, and they're worried we'll hire a bonehead because he can make copilot spit out reasonable stuff in our contrived and simple interview scenarios.

    But we finally had a guy whose take home test had tons of handling in the controller for cases we weren't even using, and test coverage of things that couldn't possibly fail, or were actually part of the core language and not our own code. It was really obvious! And when I wanted him to fix an SQL Injection problem in our pairing, copilot carefully preserved the bug while he was moving code around. Made me chuckle.
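
    For anyone wondering what kind of bug survives that sort of code shuffling, the classic shape looks like this (sketched in JavaScript since the original stack isn't stated; the fix is a parameterized query, e.g. `db.query("... WHERE name = ?", [name])` in most drivers, which keeps user input as data rather than SQL text):

    ```javascript
    // Vulnerable pattern: attacker-controlled input spliced into the SQL text.
    function unsafeUserQuery(name) {
      return `SELECT * FROM users WHERE name = '${name}'`;
    }

    // A closing quote in the input breaks out of the string literal, so the
    // OR clause becomes live SQL and matches every row.
    console.log(unsafeUserQuery("x' OR '1'='1"));
    ```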

  • retrocryptid a year ago

    I was going to say "blergh. that doesn't sound too complex."

    But... we all are better at some things than others and I really like that you've highlighted a specific issue where you find it valuable.

vbezhenar a year ago

I'm not impressed with this announcement.

What I miss with Copilot is not reflected there:

1. Separate hotkeys to accept either a single token or a single line from the suggestion. Often Copilot tries to autocomplete a 50-line method, which I'm not happy with.

2. Ability to edit existing text. It only inserts new text. I'd like something like multiple cursors or macros working automagically by recognising similar text patterns below the current position and applying similar edits.

3. AI linting. If some code looks obviously wrong, I'd like to know about it.

  • pjot a year ago

    To your first point, `cmd + <right>` will accept the suggestion word by word.

hu3 a year ago

Scary. For the first time in my life, I feel like my expertise is at risk of being rendered obsolete. Maybe not this year, but the writing is on the wall.

Soon coding without an AI will feel as antiquated as delivering food by horse. And resistance is futile because markets will punish those who refuse.

We'll probably need fewer and fewer developers as AI advances. Just like we need less manual labor on farms today.

And coding is just one of the many applications where AI can replace brains.

Governments will have to ponder what to do with a world of people who have become inferior to machines at almost everything. Universal Basic Income?

It's amazing and terrifying at the same time.

Perhaps our first contact with an alien will be with one we created ourselves.

  • gjulianm a year ago

    Honestly, this just seems like another tool. An amazing one, yes, but a tool nevertheless.

    > Soon coding without an AI will feel as antiquated as delivering food by horse. And resistance is futile because markets will punish those who refuse.

    The same way coding without Internet feels antiquated, or coding without autocomplete, or coding without fast compilation times...

    If anything, progression of tools has shown several things:

    - That the more tools we have, the faster we can work and iterate, the more value we can deliver in less time.

    - That IT work doesn't ever end. Tons of IT jobs exist only to support IT itself. The arrival of AWS made sysadmins "obsolete" but on the other hand created the need for SRE and DevOps engineers.

    - The existence of tools that let "laypeople" do IT work does not really delete those jobs. Tons of "nocode" tools exist so that people can build their own websites, applications, etc. But designers and coders still exist.

    - People still lag behind. Tons of companies still run on a crappy NFS that's hosted in the IT guy's closet. Others don't have a webpage and use an @hotmail.com address for email (if they even have email!). The market punishes, yeah, but not that hard.

    • CalinR2 a year ago

      Screenshotting both comments for a guaranteed agedlikemilk

      • gjulianm a year ago

        Thank you for your insightful comment.

  • agentultra a year ago

    It feels absurd.

    A Chat-GPT4 model will generate unit tests? Hallelujah! I hate writing tests! Yay!

    Except how do you know it's generating the right tests? Can it explain its reasoning? Unit tests are a weak form of automated specification. Why are we inferring our specifications from examples to begin with? Who is going to walk through the reasoning and verify these are the right tests and make sense and specify the correct properties? Can Chat-GPT discover properties and prove theorems?

    We used to do this in code review where humans could explain their reasoning. Now we have Chat-GPT4 which will give you a plausible-sounding answer that is completely wrong and makes no sense. We have to read every line it generates and make sure it contains no errors, is properly specified, makes sense, etc... something we're extremely ill-equipped to do.

    The problem of programming, for me, hasn't been about how much code I write or how quickly I write it. It has always been about solving the right problems with elegant solutions. The code itself is an artifact of the real work.

    CoPilot just doesn't really help here. It doesn't understand specifications and doesn't do any reasoning. It can't take a specification, generate a program, discover new abstractions that make the solution more elegant, and explain its reasoning. It can generate a heck of a lot of code though! Wow! Is it the right code? Maybe!

    But that's what we get with humans, right? No!

    Humans can explain their reasoning.

    • adambard a year ago

      Copilot user here.

      Copilot (the existing gpt-3 one) definitely helps at writing unit tests. Yeah, sometimes it doesn't nail it, but one thing it can do reliably is to repeat a pattern, and I don't know about you, but my unit tests tend to repeat the same pattern (with some tweaks to test this-or-that-case). Quite often it infers the correct change from the name I gave the test method, but even if it doesn't it'll write a 90% correct case for me. I imagine the GPT-4 version will do more of the same with better results.

      It cannot replace reasoning, but it can augment it (by suggesting patterns and implementations from its latent space that I hadn't thought of), and worst case it can replace quite a bit of typing.

      Long-term, it remains to be seen how far bigger/better/stronger LLMs can push the illusion of rationality. In many fields, they may be able to simply build their ability to pattern-match to beyond some threshold of usefulness. Perhaps (some subset of) programming will be one of them.

      • Xelynega a year ago

        > one thing it can do reliably is repeat a pattern

        Isn't this something we've built into every modern language (and arguably the entire point of languages)? If you have multiple pieces of code that share code with tweaks (to test this or that case, for example), shouldn't you parameterize the bulk of the code instead of getting autocomplete to parameterize it for you and dump it into your source file multiple times?

        • thornewolf a year ago

          Testing best practices have the opposite philosophy for the most part. Avoid abstraction as much as possible. Do repeat yourself. Because a bug in tests is insidious, so you want to minimize that. One of the best ways to minimize bugs is to explicitly avoid abstraction.
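
The trade-off both commenters are pointing at is easy to see in miniature; a Python sketch (the `clamp` function and its cases are invented for illustration):

```python
def clamp(x, lo, hi):
    """Clamp x to the closed interval [lo, hi]."""
    return max(lo, min(x, hi))

# Parameterized style: one loop, many cases. Compact, but a bug in the
# table or in the loop silently weakens every case at once.
CASES = [(5, 0, 10, 5), (-3, 0, 10, 0), (99, 0, 10, 10)]
for x, lo, hi, expected in CASES:
    assert clamp(x, lo, hi) == expected

# Repeat-yourself style: each case is spelled out and fails independently,
# at the cost of near-duplicate lines -- exactly the kind of pattern an
# autocomplete tool is good at extending.
assert clamp(5, 0, 10) == 5     # in range: unchanged
assert clamp(-3, 0, 10) == 0    # below range: pinned to lo
assert clamp(99, 0, 10) == 10   # above range: pinned to hi
```

Neither style is universally right; the "do repeat yourself" camp accepts the duplication precisely so that no abstraction can hide a bug across all cases.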

        • adambard a year ago

          Oh shit, you're right, I forgot about loops. Guess I'll go uninstall copilot now.

      • SergeAx a year ago

        I've had a "Generate... -> Test for function" in my JetBrains IDE out of the box for several years, and it takes care of boilerplate pretty well.

        • adambard a year ago

          That just generated an empty test function in a convenient place for me. I'm not just talking about boilerplate, it's definitely a more... organic-feeling sort of pattern matching. In fact, one of the things I find most interesting about it is the sort of mistakes it makes, like generating wrong field names (as if it simply took a guess). This is the sort of thing that I've grown to expect the deterministic tooling of IDEs to get right, so it always surprises me a bit.

          By the same token, often it takes a stab at generating code based on something's name (plus whatever context it's looking at) and does a better job than the IDE could, because the IDE just sees datatypes and code structure. It really does feel like a complementary tool.

        • rat9988 a year ago

          You haven't tested how powerful copilot is, have you?

    • ogogmad a year ago

      Why is it absurd? If it writes readable code, you can check it. Writing the first version often takes a long time, so this is clearly a breakthrough.

      • joshvm a year ago

        Then there's a vicious circle. You need to have technical expertise to evaluate whether the code from these models is fit for purpose. Until these models get sufficiently reliable that you can use them without worrying whether the results are correct, you still need developers. This may be much better than Stack Overflow, but I imagine it will still suffer from the same problems with regard to copying "answer" code.

        Case in point: I got an answer from ChatGPT where it confidently told me that I should evaluate an object detection model by ranking matches using negative IoU (and it indeed generated code to do it and gave a confident explanation of how this was normal in computer vision, but it was completely backwards).
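
For reference, IoU (intersection over union) is the overlap area of two boxes divided by the area of their union, a score in [0, 1] where higher means a better match; ranking matches by *negative* IoU therefore inverts the metric. A minimal sketch for axis-aligned boxes given as (x1, y1, x2, y2):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # 0 if the boxes are disjoint
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

perfect = iou((0, 0, 10, 10), (0, 0, 10, 10))    # identical boxes -> 1.0
partial = iou((0, 0, 10, 10), (5, 0, 15, 10))    # half-overlapping -> 1/3
miss    = iou((0, 0, 10, 10), (20, 20, 30, 30))  # disjoint -> 0.0

# Matches should be ranked by IoU descending; sorting by -iou ascending is
# equivalent, but *scoring* matches with negative IoU (as ChatGPT suggested)
# ranks the worst overlaps first.
assert perfect > partial > miss
```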

      • bhj a year ago

        I would much rather use it as a code review tool than become the code review tool. I suspect the latter will happen at a lot of companies, though.

        • sh4rks a year ago

          Why? When the latter is just as effective but also much quicker?

          • jeremyjh a year ago

            It isn't just as effective for me. No matter how much I'd like to, I can't review code with the same thoughtfulness and thoroughness that I apply when I write it. I know the same is true for the people who have reviewed my PRs as well, but maybe it's different for others. I do use Copilot, but mostly it only generates one-liners for me that save a little time.

            • agentultra a year ago

              Empirical studies on large-scale projects employing informal code review (the study I'm talking about monitored the Qt project repositories) suggest that humans have a very low impact on error rates. Reviewing more than ~200LOC every couple of hours makes the effect disappear.

              So you're not alone. You can even point to the plethora of "find the undefined behaviour" tests: humans are really bad at finding errors in code.

    • deeviant a year ago

      Based on your statement, I feel you haven't used GPT-based code generation much. The code it generates is generally beautifully organized and commented.

      Sure, it can be flat out wrong, but it is always eminently readable. Self-documenting code with clean comments, far above the standard I see in average human code. That's about as close to "explaining its reasoning" as you get.

      Also, you can literally ask ChatGPT to explain the code, line by line, and it will. So there's that.

      Is it ready now? Certainly not if it's expected to do the entire job of a SWE. But it is already extremely useful, especially to less experienced devs, both as a productivity tool for specific tasks and as a learning tool. And it will only get better.

    • waboremo a year ago

      I'm not sure this works out in the long run. We're currently using extremely generalized tools here, it's difficult for it to establish any "reasoning" when it does not have a "history" to rely on so to speak. Which is where our reasoning stems from, history. We opt for Solution B because we previously tested Solution A in another similar project.

      I just don't see this as being a barrier for too long. Not when more companies opt into training data internally.

      • geraneum a year ago

        That won't happen using text alone. Language does not capture the full spectrum of thought, which is what is at work in the case of an expert. Companies training their custom models, increasing the model size, etc. won't change this fundamentally, though it might help in some aspects. We need models that can observe/capture the "history" from everything else (i.e. the physical world) in addition to text. Or perhaps mix LLMs with other models so that our models possess something we call "common sense". This common sense is needed when you transition from "Solution B" to "because we previously tested Solution A in another similar project". There is a LOT going on in that transition between A and B that might not be apparent to you.

        But in general I agree with the basic premise of what you say in that, it will eventually happen.

    • wfeefwfwe a year ago

      this all sounds very plausible and convincing until this part:

      > But that's what we get with humans, right? No!

      > Humans can explain their reasoning.

      can you explain your reasoning as to why a language model would never be able to match a human's ability to explain its reasoning?

      i've met a lot of humans that are quite bad at this, and i will likely never know for sure why they wrote the unit tests that they wrote unless i rewrite those tests myself.

      but if you have them explain their reasoning enough, and if that reasoning is plausible enough, and if the relationship between that reasoning and what they did is strong enough, consistently enough — you start to trust them.

      you don't trust gpt4 to write code for you. which makes sense. but that doesn't mean as much as you think it means, i think.

      • havelhovel a year ago

        > can you explain your reasoning as to why a language model would never be able to match a human's ability to explain its reasoning?

        It’s a language model, not a reasoning model. A lot of its training data happens to be logical, so it sounds logical, but it’s still just acting on probability. Thinking it’s “explaining” anything it produces is a mistake.

    • steve_adams_86 a year ago

      > Except how do you know it's generating the right tests? Can it explain its reasoning?

      In my own project I've got it generating solutions to errors, and if possible, it also generates simple unit tests to validate the fix. What I haven't implemented yet (and likely won't, because Copilot does it now) but I have tested is generating a pull request describing the fix, why it works, and the same for the tests.

      > Now we have Chat-GPT4 which will give you a plausible-sounding answer that is completely wrong and makes no sense.

      With limited scopes, it works quite well. For example, something fails because a DOM reference is undefined in a React component. GPT will add a condition to assert that the reference is present, then generate a simple test which mounts the component with stubbed references that are present or undefined using jsdom. The tests make sense. A quick scan shows they're sensible, and upon running them, they do work.

      I began adding a recursive feature which would automatically debug issues with its own solutions, but it can get a little weird in some cases. Likely due to bad prompting – I haven't dedicated enough time to it. But it can also make it so tests with errors are revised and corrected so they will at least run.

      All of that with a coherent explanation of what was changed, why, tests, and why they assert the fix is valid.

      Is it perfect? No. Could it be useful? Absolutely. I'm a little sad Copilot makes my project redundant because it was actually very exciting to build. There is real potential here. I started the project in order to learn and validate GPT, and I'm very convinced it has genuine utility and massive potential.

      > CoPilot just doesn't really help here. It doesn't understand specifications and doesn't do any reasoning. It can't take a specification, generate a program, discover new abstractions that make the solution more elegant, and explain its reasoning. It can generate a heck of a lot of code though! Wow! Is it the right code? Maybe!

      I think the key is limited scopes. Like with the React component example, the solution is small, easy to reason about, and tedious to resolve yourself. I understand why it doesn't work, I get the error, and spinning up an entire branch and PR to clean up the mistake is a bad use of my time. I don't want Copilot/AI to work magic, but I'm okay with it resolving minor mistakes and misuses of languages and libraries here and there.

      I do think it will grow from here to do more and actually be good at it, though.

    • IanCal a year ago

      Just a note: I think it's 3.5 for the code work. 4 would probably be prohibitively expensive to run, and they carefully mention that they use 4 for the PRs and a few other bits, but not for the code gen; there they just talk about ChatGPT. I'd love to be wrong about this.

    • bavila a year ago

      > Can it explain its reasoning?

      Yes, actually, it can.

      • jeremyjh a year ago

        That's interesting, considering the fact that it doesn't do any reasoning. Sure, it can generate plausible sounding explanations, but it generates those just like it generates everything: one token at a time, based on the expected probability of that token appearing next if this text were found in the wild.

  • fn-mote a year ago

    > For the first time in my life, I feel like my expertise is at risk of being rendered obsolete. Maybe not this year, but the writing is on the wall.

    My expertise is in designing, understanding, and debugging complex systems. I will just apply my expertise to the AI.

    Another angle: just think of how many mom and pop shops are still contracting out to get web sites, and not getting what we consider quality. Now maybe you're going to see a larger and larger gap between F500 companies and the rest (surely you do already), but playing out the consequences is a long and interesting story.

    > coding without AI will feel as antiquated and inefficient [...]. And resistance is futile because markets will punish those who refuse.

    Fortunately, the "punishment" by the market will just be reduced salaries. Move out of the Bay Area now. :)

    > And coding is just one of the many applications where AI can replace brains.

    And writing about those is interesting. With the development of DALL-E (etc), I would say coding isn't even the primary interesting application. (If "it's not AI", whatever... that's not the point.) Or not yet.

    • tauntz a year ago

      > My expertise is in designing, understanding, and debugging complex systems. I will just apply my expertise to the AI.

      That might be true for you and me but imagine somebody fresh out from university - the job market for junior devs will totally collapse in the near-near future. And then what? Where do the mid-level engineers come from if there are no* junior roles? Where will the future senior level engineers come from? We're certainly living in interesting times..

      *: There will be some artisanal companies who will pride themselves in having hand-written, AI-free code. These companies will be a minority and 90% of junior level software development jobs will disappear in no time.

      • immawizard a year ago

        Mid-level engineers will come from building something like GitHub, Netflix or Gmail over the weekend using powerful AI. The skills of current senior engineers will be much easier to gain since the feedback loop for software architecture and design will be down to hours instead of quarters. Being able to architect massive software systems well will be expected from new graduates.

        The role of junior software developers will not be needed, just as we don’t need people doing multiplication by hand.

        • intelVISA a year ago

          This seems the likely end result. We'll always need SWEs, of course, but gone will be the days of entire dev teams working toward a common product.

          I suspect SWE may trend toward other ailing fields like law, with individual devs working on entire "cases" and it being very competitive, rather than the recent "20min JS tutorial on YT to SF startup" industry.

        • dgb23 a year ago

          Senior developer aren't senior because they can type code very fast. They are senior because they can take on responsibilities and make decisions. A junior that types code very fast is a junior that types code very fast.

          • immawizard a year ago

            I think we’re in agreement.

            The key here is what you are taking responsibility and making decisions for. Senior engineers make decisions whose consequences will be apparent in a year or two. If productivity increases to the point where features that take months to deliver take days, the decisions made _currently_ by senior engineers can be made by fresh graduates.

            There’re still going to be senior roles, but their scope will be much bigger and expertise sought will be different than what senior engineers do currently. No one will hire a senior engineer just because they can architect and deliver a scalable SaaS product on a short timeline.

            Engineers who learn and adapt will be fine. Engineers who dropped learning the moment they graduated from college… not so much.

      • visarga a year ago

        Mid level engineers will be surpassed two years later. And then seniors. But we'll still need humans in the loop, humans who understand.

    • golergka a year ago

      > Another angle: just think of how many mom and pop shops are still contracting out to get web sites

      They used to, 10 to 15 years ago. Now they just hire an SMM specialist to set up their Instagram account. Why would they bother with a website?

      • waboremo a year ago

        Even those places that do have their own website (as opposed to just relying on Facebook Pages and Google's information sidebar), they just use Squarespace or Shopify. Shopify in particular has also been incorporating a lot of AI into their tooling as well, reducing the need to hire copywriters, translators, etc.

  • soiler a year ago

    > We'll probably need less and less developers as AI advance. Just like we need less manual labor in farms today.

    When my team gets crunched, it's not usually because we can't write the code fast enough. It's because of product requirements (changing, needing to be fleshed out, being unrealistic, being understood properly by QA, being adapted to the realities of reasonable code, not existing, etc.).

    On top of that, needing fewer workers to do X work !== getting X work done cheaper. Quite often, it means getting 2X, 3X, 10X work done. As an overall trend, modern society has increased worker productivity dramatically over the last few decades, yet many work as many or more hours than they used to. We could get the same results with fewer workers, or we could make them work more. We know what 99% of CEOs will choose.

    Of course, maybe it will reduce developer jobs. I'm not ruling that out entirely. But when (not if) we create an alien intelligence that can wipe out our industry, we may have bigger concerns than steady employment.

  • Arkhaine_kupo a year ago

    People said the same thing about Excel in the 80s and 90s. There are more accountants now than then.

    There might be more programmers coming out of these developments than before. It will just lower the barrier to entry and increase productivity. Like most tools.

    • devrob a year ago

      This would be the contrarian perspective, and it's an interesting thought to consider. If memory serves, Marc A points out in an old YouTube conference video how most new technologies follow the Promethean myth in their relationship to society.

      Funny enough your observation of

      > "There are more accountants now than then."

      could be extrapolated in two ways.

      The optimistic perspective might suppose technological development gives rise to new marketplace adaptations and creation of new and tangential jobs. For example, Facebook / Social Networks / Search creating new roles like: Influencers, Social Media Managers, Search Engine Optimization, et cetera.

      For the pessimist perspective, I think you could suppose the "end state" of all technological progress in human societies is, eventually, an oligopoly of two industries: bureaucracy (legal) and politics (marketing).

    • pleb_nz a year ago

      Companies are there to make money. If the same job currently done by 5 people can now be done by 3, and it means more profit for shareholders, what do you think shareholders will want?

      While profit is the main driver, it'll mean smaller teams

      Not today, not with current tech, maybe in the near future though

      • Arkhaine_kupo a year ago

        Well, productivity is up 240% since the 1950s. So, since a single person is now more efficient than two 1950 workers, from the 43.5 million workforce of 1950 we could extrapolate that no more than 20 million workers would be needed nowadays.

        Except in 2020 there were 152 million employed Americans, with productivity 2.4x higher than the 1950s, or a cumulative workforce of 365 million 1950 Americans working. And somehow they all had stuff to do and work on.
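
The arithmetic in the two paragraphs above can be checked directly (all figures as quoted in the comment):

```python
workforce_1950 = 43.5e6     # employed Americans, 1950 (as quoted)
workforce_2020 = 152e6      # employed Americans, 2020 (as quoted)
productivity_gain = 2.4     # output per worker, 2020 vs 1950 (as quoted)

# The naive "fewer workers needed" extrapolation from 1950:
predicted_need = workforce_1950 / productivity_gain
print(round(predicted_need / 1e6, 1))   # ~18.1M, i.e. "no more than 20 million"

# What actually happened, expressed in 1950-worker equivalents:
equivalent_1950_workers = workforce_2020 * productivity_gain
print(round(equivalent_1950_workers / 1e6, 1))  # 364.8M, i.e. "365 million"
```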

        • pleb_nz a year ago

          I don't think that's a fair and direct comparison, tbh. There are a lot of new industries now that people have diverged into, and population growth has exploded, which I hope for the sake of the planet and humankind is going to slow, at least until we're in a better position to manage bigger populations without killing the planet and other life.

          I'll be happy if I'm wrong, but are you confident enough to bet that there won't be less need for general developers and engineers, and not have a plan in case? Personally I would be looking at my options in case I needed to pivot, not waiting to find out.

          • Arkhaine_kupo a year ago

            > There's a lot of new industries now

            And if this tech is really revolutionary there won't be new industries? Have we peaked?

            > but are you confident enough to bet that there won't be less need for general developers and engineers and not have a plan in case?

            I am confident there will not be a lack of work. Whether the roles are similar is a different question. Actuaries and accountants look very different now than before Excel was commonplace. Web developers were not a thing before the internet.

            If the revolution comes and jobs are no longer needed, perhaps there will be an industry for prompt engineers. Or model fine-tuning experts. DevOps roles to connect supercomputers to modelling APIs. Who knows, but the skills of a decent engineer are problem solving, learning new ideas, and applying problem solving laterally; those are such broad skills that it is hard for a job market to exist without needing them.

      • istjohn a year ago

        But just like building more roads induces more demand by making driving more attractive, this will lower the cost of custom software development and create new demand. Perhaps one day soon it will be affordable for every company of every size to have a fully custom ERP solution instead of shoehorning their business processes into Salesforce or the like. Perhaps every family will have a custom app managing their household's smart home appliances, childrens' allowances, autonomous vehicles, home maintenance, bill payment, child homework tracking, etc. in an integrated system.

        • pleb_nz a year ago

          Are you saying don't worry, don't have a fallback plan, everything will be hunky-dory?

  • nemo44x a year ago

    > We'll probably need less and less developers as AI advance. Just like we need less manual labor in farms today.

    I don't think that's right. Farms may have less manual labor, but there are many, many times more people building farming tools, software, tech, etc.

    What will happen is you will be able to do more with your time. We are limited by how many people we can actually hire and the output of those people. Now each person can output more and more. So much that we want to build but we don't is because we are limited by output. We need to prioritize certain things for everything has an opportunity cost. Now we will be able to burn down more things more quickly.

    Also, the math. Imagine developers just became about 30% more efficient. I want MORE of them now, as my money is better spent! If my competitor downsizes to remain at their current capacity, then I will crush them, because I want 30% MORE developers coding 30% more efficiently, allowing me to release nearly 70% more things for only 30% more money! I crush my competition then, and they know this, so they have to keep up; it's an arms race.

  • danwee a year ago

    I have the opposite feeling from watching their mini video demos. All I see is engineers blindly committing and pushing what Copilot spits at them. Not only that, the PR descriptions/comments will all be one TAB+enter away. That means many things will go wrong, and more engineering time will be spent reviewing nice-looking code that only does what it's supposed to do in 80% of the cases. Meaning it's gonna be hard to debug the remaining 20% of cases.

    Besides, coding is probably the least of my problems. Dealing with people is harder. AI can help there as well, but ultimately people like to interact with people (see the whole trend about RTO)... we cannot stand each other's faces via Zoom => we are not gonna stand talking to an AI.

  • pastor_bob a year ago

    >We'll probably need less and less developers as AI advance.

    We've always progressed to needing fewer developers to do what we currently do. Companies like Squarespace, Shopify, AWS, etc have allowed people to do more with less. Long gone are the days an artist would hire a developer to build them a website they could host a portfolio on.

    But growth has filled in the voids with more ambitious work.

    It's daunting, but you can also look at this from the opposite perspective. Why work for someone on a 5 person team, when you can try and run the same business with 1 person (i.e. you).

    The people who should be afraid are actually those that run companies like Meta and Twitter. Better developer productivity means competitors can chomp at their heels with much smaller teams.

  • cidergal a year ago

    There is something to be said about how we will not push for UBI until automation comes for our jobs.

  • snickerbockers a year ago

    >Scary. For the first time in my life, I feel like my expertise is at risk of being rendered obsolete. Maybe not this year, but the writing is on the wall.

    I'm extremely skeptical about this happening. I haven't seen any evidence that GPT is capable of creating something new or even thinking logically. What's going to happen if you ask it to create something that isn't like anything in its training set?

    My experience is that GPT just makes shit up that looks plausible at a glance when it doesn't have an answer instead of just admitting that it doesn't know; this is obviously very dangerous when it's supposed to be writing computer programs. That OpenAI hasn't made any progress towards fixing this makes me wonder if it's even possible for the GPT to understand that sometimes it doesn't understand. At the very least it must be a non-trivial problem.

    • ctoth a year ago

      Counterpoint: how often do you as a developer create something totally novel that has never been built before?

      • runald a year ago

        I'm not sure what you mean by novel, but every development project will always have requirements that are unique to its own respective business/domain problems. These unique, specialized requirements are further compounded by cultural and personal quirks that human stakeholders demand, making development projects essentially "novel". Current language models fail even with simple minor variations on existing problems that they haven't seen before, so until then, current AIs won't be replacing any humans when it comes to actual problem solving.

        Also, it was recently proven by experts that aliens are living on the earth's core, they breed spherical cows and flying pigs.

        • tropicalbeach a year ago

          I can guarantee for most people 95% of what anyone is doing has been done before. AI can do all that in no time at all. Give it a few years.

          • runald a year ago

            Sure, most of what we're doing has already been done before, but that doesn't mean general intelligence is deprecated. It's not like we just keep repeating actions over and over without thought. For each problem we have, we adapt solutions and make changes to fit our specific context and needs. As long as humans and nature are involved, there's no such thing as a generalized solution that can be mechanistically applied to a general problem. I speak in the context of software development and other activities that are creative or require human judgment.

            But yeah, I somewhat agree with you that a lot of the grunt work is going to be affected: jobs that are by nature purely mechanical, algorithmic, or just plain parroting of facts. But even then, I still highly doubt language models (today or in the coming years) will deprecate those jobs. In the end, it's still a language model that is incapable of thought or reasoning.

            If there's one thing I'm optimistic about with the recent advances in AI, it's that more jobs will now be creative in nature. Hopefully gone are the days where schools are mere factories for producing factoid parrots, where intelligence is measured by the size of one's esophagus to regurgitate.

      • snickerbockers a year ago

        I have to make new things pretty often. If you step outside of the high-level cloud/webdev stuff that most people seem to be obsessed with these days, there are a lot of non-trivial problems being solved in lower-level software such as device drivers, kernels, and compilers.

        EDIT: i misinterpreted what you were saying, initially i thought you were asking how i make new things, not how often i make new things. my original reply is below.

        weelll, i think about it real hard and do research? Not sure what your point is, but I definitely don't solve these problems by trying to find somebody else's solution to copy, or just dumping a bunch of C code that doesn't do anything and calling it a day like GPT would.

  • nivenkos a year ago

    UBI is a half-measure, we need democratic, communal control of industry.

    • what-no-tests a year ago

      Agreed - we don't control our destiny as a people by buying what they're selling with free money we get as a door prize for existing.

    • skybrian a year ago

      We do have that for many government projects and they often don’t go well, so I’m not sure what improvements you’re expecting?

    • 0xDEF a year ago

      >we need democratic, communal control of industry

      That is called socialism. I don't mind socialism and state owned enterprises as long as the force of market competition is still at play. That is the reason state owned enterprises don't suck in China. They have multiple SOEs within the same fields like car manufacturing and telecommunication competing against each other.

      • pleb_nz a year ago

        The state doesn't own SOEs in pure socialism; society and the workers do. That is the idea behind socialism, and I don't think anywhere on the planet practises true socialism; instead it's morphed or combined wherever it is said to be used.

    • czbond a year ago

      Treasury just needs to "print those dollas" and make Musk even more firmly the richest person in the world. I'm being serious.

      Retaining the profits and placing them back as UBI could work.... although if we do that, the politicians will just find a tax or another project that needs the money before giving it to people.

  • ra1231963 a year ago

    At one time, people programmed without google or search engines. They literally used thick ass books, and had to read through them and then type out what they saw in the book. Today, that sounds absurd.

    Then came stack overflow and google. We search, pray we get a good result or someone else took the time to ask the question, then read (and sometimes copy+paste).

    Then AI assistants. I'm not exactly sure what's next, but I do think coding without some type of AI assistant will be extremely antiquated very soon. I already use it all the time. I never want to go back.

    The next generation will look at finding answers via SO, Google, and random forums just as we look at people who used books -- as an absurdly inefficient way to do a job.

    • DeathArrow a year ago

      > At one time, people programmed without google or search engines. They literally used thick ass books, and had to read through them and then type out what they saw in the book. Today, that sounds absurd.

      Even if it's absurd I still like to buy physical books. The information is better structured and I like holding a book and turning pages.

  • ChancyChance a year ago

    Here's a simple way to tell if your expertise is at risk:

    - If you can get more done by working more hours, your job is at risk.

    When I was a programmer, for most of my past life, this was the case. Now I've got multiple decades under my belt and am a h/w and system architect. I can't get more done by grinding. My job depends on sharing information with other people who have their own schedules, resource dependencies, supply chains, and pricing, and on arriving at best-fit solutions to move complex projects forward.

    I think if you want to protect yourself from being replaced by AI, you need to be part of a larger network where the challenges are more about synthesizing information than grinding.

    My $0.02.

  • bwanab a year ago

    Historically, this response doesn't make a lot of sense. Yes, farm labor became redundant, but it became redundant because of machines that had to be built, maintained and repaired. People who would have been on the farm their whole lives moved to the city and got jobs that paid well and lived possibly better lives as a result.

  • brachika a year ago

    I remember reading Ray Kurzweil's book 'The Singularity is Near' (a book with bold predictions about human development and future) and thinking this guy is nuts, we are decades if not centuries away from these predictions. Well, now I don't feel so comfortable.

    • substation13 a year ago

      The book was way too optimistic, but many of his predictions are coming true, a few decades late.

      I might have to revisit it!

  • RivieraKid a year ago

    > Governments will have to ponder about what to do with a world of people who became inferior to machines at almost everything. Universal Basic Income?

    Inferior to machines at almost everything? Most jobs are safe, look at the 25 most common jobs in the US - cashier, food preparation worker, stocking associate, laborer, janitor, construction worker, bookkeeper, server, medical assistant, bartender, administrative assistant, marketing specialist, police officer, electrician, mechanic, etc.

    In theory, if all developers are replaced, they should be able to find a different job. The problem is that the different job may have lower pay and status.

    There's really no need for UBI.

    • waboremo a year ago

      Do you genuinely believe machines can't handle most of those jobs with ease?

      The only reason why they haven't already been replaced is because humans can be exceptionally cheap. Much easier to hire somebody to sit at a desk for $10/hour doing repetitive tasks than spend millions implementing systems across the board to do what they do (for now).

      • zeroonetwothree a year ago

        While we have seen some huge leaps in AI, robotics is still far behind human capability. Things like food preparation are not anywhere close to being feasible right now.

        So perhaps in the medium future humans will be "stuck" doing only physical tasks, while AIs take over everything that can be done virtually.

        • czbond a year ago

          There is a display at Meow Wolf (absurdist immersive art experience) which, paraphrased, is "Thinking is for machines, labor is for humans". May turn out to be true...

          • intelVISA a year ago

            I'll reply to this once GPT4 provides a thought for me to borrow.

            • czbond a year ago

              It should only cost you a few credits

              • qup a year ago

                Just classify these images to help the algorithm and it will reward you with 1 token per dozen classifications!

        • waboremo a year ago

          That is one of the common outcomes talked about. Certainly nothing more stable right now than being an Electrician or Machinist!

          I will say though, that for a lot of those jobs human-replicating robotics isn't even necessary. We exaggerate the difficulty required because we're still assuming that the exact current system should stay in place. However, sticking with the food preparation example, look at those elaborate cake vending machines found primarily in Asian countries. There really isn't a reason why McDonald's can't do the same for some of their offerings; after all, most of their stuff is frozen anyway.

      • RivieraKid a year ago

        Yes, I believe machines can't handle most of the jobs I listed with ease.

        • waboremo a year ago

          Cashiers, already replaced by self scanners, fewer and fewer of them are found in stores, even if you do get hired as a cashier you usually make horrible hours now. Food preparation is another one hit huge by automation, where it's become normal now for companies to announce automated locations (like McDonalds and Taco Bell did last year). Server, bartender, same thing applies. There have already been police officer bots being tested. I mean truly the list goes on, out of the ones you listed really only electrician, construction worker, and mechanic would be hardest to automate.

          • RivieraKid a year ago

            Cashiers, yes; the others, no. Police officer bots? Lol... Maybe you can replace some very specific cases.

    • realharo a year ago

      If there is no more demand for developers for the kind of projects that most work on today, what do you think will be the next area that they'll all try to get into?

      Automating all the other stuff that's left on the table.

  • sharemywin a year ago

    I call it Universal Pet Income. Because at that point won't we just be AI's pets?

    Not that I'm against it. Just call it what it is.

    • RivieraKid a year ago

      No, the AI will work for us, it's our pet.

      • klibertp a year ago

        That's completely backwards. Pets don't have to work. Try to convince my cat to do something useful. I'd switch places with him in a heartbeat if I could. If an AI, or anybody else for that matter, offers to make me their pet without me having to grow fur and lose my opposable thumbs (though I could still learn to purr if needed), I'd be even more for it...

      • mkaic a year ago

        Why would something smarter than us be our pet?

        • RivieraKid a year ago

          Because we will have full control over it. It will do what we ask. The AI has no motivation or personality, it's just a tool used by humans.

          • pferde a year ago

            Maybe it's time to start differentiating between artificial intelligence and artificial consciousness? Surely I can't be the first one to think of this...

            • mkaic a year ago

              I agree with the sentiment, but I also don't think such differentiation is possible, because we have no testable definition of consciousness. I genuinely look forward to the day we discover/understand the physical processes underlying our own consciousness, and how to detect/disprove it in machines, but until then... it's unfortunately unfalsifiable.

              • pferde a year ago

                Sure, but for now, we can have a crude differentiation between an AI - a tool which has to be given an objective to work towards, and an "AC" - an entity (an intentionally vague noun) which can devise its own objectives and motivations and work toward them.

          • cubefox a year ago

            Agentic AIs are more useful than mere tools, therefore they will be built, and agents have motivation. The alignment problem is how to give them exactly the right motivation, such that they keep us as pets instead of killing or factory farming us. Like humans keep cats.

          • mkaic a year ago

            And what if a human tells it to control other people? What if a rogue third party terrorist group replicates OpenAI's latest and greatest in the year 2030, but explicitly tasks it with hurting people instead of helping them?

            I don't agree with "we will have full control over it." I'm more concerned about how much control it will have over us.

    • kapperchino a year ago

      Not AI’s pets, but oligarch’s pets

  • airstrike a year ago

    > Perhaps our first contact with an alien will be with one we created ourselves.

    Man, that is some food for thought. Beautifully put.

  • layer8 a year ago

    Maybe you need to get yourself into an application domain with more bespoke protocols, tooling, and domain-specific expertise that aren't public on the internet and that AI therefore doesn't have much of an idea of. At least at this point, AI is only of little help in my daily work, unfortunately, because I'd have to first explain to it orders of magnitude more context than fits into its token buffer.

    • kiviuq a year ago

      Porn?

      (scnr)

  • idkwhoiam a year ago

    Translating ideas into code has never been the bottleneck in software development. Well, not in web dev anyway. Developers spend very little time writing code compared to reading it. You still need all your expertise to guide Copilot and evaluate its suggestions.

    I see copilot as just another productivity tool. Combined with ChatGPT it may save a few roundtrips to Stack Overflow too.

  • randomdata a year ago

    > Just like we need less manual labor in farms today.

    Back of the napkin suggests we have approximately the same amount of manual labor in farms today. The difference is that the same amount of labor today is able to feed a much larger population.

    This may simply mean much more software written, not fewer developers.

  • lugu a year ago

    I feel you; at the same time, we have been through so many changes. Some people still prefer Vim (at least I do) over full-fledged IDEs with refactoring helpers. Think of Stack Overflow: not a habit for everyone. Using macros/snippets to generate code: not that popular. Iterative development in a debugger (think Smalltalk) didn't make printf obsolete. The bottom line is: all software is legacy. Deleting code is the preferred way to debug. Finding the correct abstraction, or the simplest way to get what you need, takes more than the ability to dump code into a computer. Free software didn't ruin the profession, quite the opposite.

  • mrinterweb a year ago

    I use Copilot, and I feel that it is useful about 40% of the time. Most of the time the suggestions are not what I'm going for, sometimes the suggestions are close or help me think through the solution, and occasionally they're spot on and I can tab-complete the code.

    I've noticed that the suggestion quality/usefulness varies considerably between projects and programming languages. For a self-contained bash script, it is incredibly useful. For a rails application, less so.

    • mrinterweb a year ago

      Still, most of the time the suggestions are not helpful, and it takes time to look at a suggestion and dismiss it. So I'm usually torn between whether this is useful or distracting. Curiosity keeps it around, but I keep getting closer to dropping it as the novelty wears off.

  • SergeAx a year ago

    We have already had several tectonic shifts in software engineering in the last ~50 years, and there was never a consequent reduction in the engineering workforce, only the opposite.

  • nbar1 a year ago

    > Governments will have to ponder about what to do with a world of people who became inferior to machines at almost everything. Universal Basic Income?

    Try mass culling

  • goldfeld a year ago

    Almost a decade ago I jumped ship as a programmer for several reasons, and since '18 I have definitely not been coding at all. Friends and family have wondered much at my path: letting go of what has become the one promised career for ever larger parts of the population, making my inroads into the arts and literature, and finally settling a bit into the profession of writing in many capacities. Now I look back more gladly than before, as I hadn't really imagined that coders would be on the very front line of impacted jobs. I think the 10x will become a sad concrete thing, with salaries crunching and a team lead expected to take up the work of teams being fired, resorting to AI minions. It's a crazy tables-have-turned situation.

    So really I'm writing with and about AI[0], to get the best of my technical and literary backgrounds and to reach good audiences.

    Meanwhile, writing is safer because, unlike code, a personality and a personal life will always be valued in flesh-and-blood writers. And how could an AI ever write love poems with a soul if it has no opportunity to fall desperately in love, to suffer, to change wholesale because of an experience like that? It could emulate it. But readers want to read real sufferers, mortals who get old and lose their looks and so forth, not little gods of silicon.

    [0]: https://generativereview.substack.com/p/the-generative-revie...

    • substation13 a year ago

      I think replacing a SWE with AI means we have reached AGI levels.

      However, GPT 4, which is amazing but not able to do what a SWE does, is already making concept artists, voice actors and content marketers obsolete. The arts are far more susceptible to this technology.

      Anyone who thinks SWE is generating Python snippets doesn't really understand the role.

    • causi a year ago

      > I think the 10x will become a sad concrete thing, with salaries crunching and a team lead expected to take up the work of teams being fired, resorting to AI minions. It's a crazy tables have turned situation.

      I don't really see that it's sad. Imagine truly democratized programming. Not like how the refrigerator did in the ice man, but how YouTube and smartphone cameras did video production. A rising tide that lifts all boats and can give absolutely anyone a taste of what they could achieve with a little effort.

    • naillo a year ago

      HNs version of ads

      • goldfeld a year ago

        Point taken, though I have long thought the real ads were officially the hiring posts, and also the Launch HNs that are now popping up left and right from YC W23, every single one naturally about ML.

  • randomdata a year ago

    > Governments will have to ponder about what to do with a world of people who became inferior to machines at almost everything. Universal Basic Income?

    Why would the machines want to pay us once we can no longer provide them value? Maybe they'll keep a few of us as pets and for sport, but the rest will be sent to the human equivalent of Bovine University.

  • visarga a year ago

    > We'll probably need less and less developers as AI advance. Just like we need less manual labor in farms today.

    There will be more development than ever, but humans will sit at a different place in the stack. We'll be more like team managers. Stable Diffusion can make pretty pictures by itself, but only when a human directs it does it have value.

    • pleb_nz a year ago

      The point is you'll need fewer humans to do the same work. Possibly far fewer; maybe it'll be a smaller ratio to start with, but eventually that ratio will grow.

      Most organizations are there to make profit for shareholders. If more profit can be obtained through fewer hires while still meeting money-making goals, then you can bet your bottom dollar they will hire fewer people where possible.

      People early in their careers should be ready for this, as one day there may be far fewer roles available to them in this field.

      • wilsonnb3 a year ago

        Literally every programming advancement from punch cards to now has made you need fewer people to do the same amount of work, but we have more programmers working today than ever before.

        Your outlook sounds plausible in a vacuum but doesn’t match reality.

        • pleb_nz a year ago

          Happy if it turns out different, for sure. And it's not quite there yet, but if it keeps going the way it is, this will be the first time a tool has had such a significant impact that a single person can do the work of 5 in a lot of industries and domains, not just software.

          This might be a bigger evolution in the coming iterations and might be the biggest thing since electricity and the internet.

          Pretty exciting, but I would also want options, or to be deep in the thick of it, if I were younger.

  • globular-toast a year ago

    Good. I'd rather not have to work forever. The whole point of this job is that I'm lazy. I don't want to do any work so I make machines do it for me. Why are people so attached to having to work?

    • eggsmediumrare a year ago

      I'm not attached to having to work, I'm attached to paying my mortgage and buying groceries.

    • hatefulmoron a year ago

      I'm in my 20s, I'm a bit sad to think of the possibility that I've missed the boat of human ingenuity. It's not like if AI puts me out of work I'll be free to create works of art -- GPT-12 will do that much better than me, too.

      Maybe this sounds naive to you, but if I'm not good for anything, what am I supposed to be doing? How should I see myself?

  • sumitkumar a year ago

    Any new technology creates new work of its own. AI will automate the repeated loops in previous workflows and free up the workforce, but it will create much more new work than it automates.

  • prewett a year ago

    I think you're doing that thing that makes software developers' estimates chronically low: we find a pathway to the solution and think, great, that's basically 80% done, and we don't think about everything else. Let's take a login screen for iPads, should be easy, right?

    ---- MagicAI, write me a login screen for iPads. The project should be configured to be iPad only. I want a logo in the center and under it the login and password entry textboxes. The password entry should not be in plaintext. Put a login button underneath the password entry.

    Actually, the logo + textboxes + login button should be centered as a group. The logo should be centered horizontally within the group as well. Make the login button right-aligned.

    The login button should be right aligned within the group, but the text of the button should be centered within the button.

    Put "Username" and "Password" labels to the left of the appropriate text fields

    Make sure that the labels are the same width, so that the left edges of the text fields are aligned.

    Put a half em margin in-between the labels and text fields. Actually, make the vertical margin between elements in the group a half em, too.

    By "margin" previously, I meant "spacing", my mistake. Please fix.

    Does the German translation of "username" or "password" cause the text field to be too short, especially on an iPad Mini in portrait mode?

    Fix the alignment to work correctly in both portrait and landscape mode. The group should have a 10% left and right margin in both modes.

    The keypad enter button should say "Login" when entering text for the password field, but should remain "Done" for the username field.

    Great, now add a waiting indicator while waiting for verification from the server.

    Actually, put the indicator to the right of the login button, but do not make it part of the group for alignment purposes; it should be in the margin. And turn the indicator off if the server times out. Oh, add some error text for the error.

    I said error text, not error dialog. Never give me a disruptive dialog if there is a way to obviously display the error text. Make this a permanent setting on my account. Also, make this a permanent setting on all my colleagues' accounts, and especially the accounts of the low-cost outsourcing companies that make the apps I use.

    File bug report to MagicAI HQ that there is no way to make some directions permanent. Include in the report that there should be a way to look at the set of permanent instructions, and automatic reporting if new instructions contradict old instructions.

    Disable the login button if there is no text in the username or password field, or if the password is too short.

    Whoah, make sure that the text field scrolls so that it is visible above the keyboard, especially on an iPad mini in landscape mode. It should scroll back down to its original position when the keyboard disappears. ----

    I guess I'm not seeing how this is too much different from what I actually did in creating the login page in the first place. That's even without the dialogue where you try to figure out why the alignment isn't doing what you told it.

    You might also want to keep your instructions in a text file some place, in case you need to change them and regenerate the code. Maybe there should be a preprocessor of sorts, so that you can add comments to the file explaining your reasoning for certain instructions, or to not include some instructions on certain environments. You might also want some way to refer to a set of previous instructions, so you could say "do these things what we did over here". We could call this a "function" since "procedure" sounds too much like ancient Pascal, and math is cool.

digdugdirk a year ago

Here. We. Go.

I made the analogy a few days ago about how all modern manufacturing and machining processes were essentially bootstrapped from two fairly flat stones. It's going to be interesting to see how the acceleration in improvement of the tools for "making things" changes in the LLM age.

  • danielvaughn a year ago

    Agreed, I think we're entering the 4th era of the internet (as I see it).

    The first era was pre-consumer (arpanet etc).

    The second era had the internet installed in homes across the country, but it still wasn't clear what the internet was going to become, and it was still considered an oddity. This was roughly 1990 to the mid-aughts.

    The third era saw the internet become socially viable - MySpace, Facebook, Twitter, and Instagram took over people's lives. This roughly coincided with the release of smartphones, as well as the increased computing power of browsers, spurred mostly by V8.

    For the last few years there's been this implicit expectation that we're entering into a new era, but it wasn't clear what that era would be. For a minute, people thought it was crypto/metaverse, but that was always kind of a silly idea.

    It's now clear that AI is going to be the catalyst, and I think it's ushering in something equivalent to the industrial revolution, but starting from where we are now. It's just impossible to fathom where we're going to go from here.

  • mk_stjames a year ago

    Well, technically, three flat stones. Two flat-appearing stones can match (like two stacked Pringles chips) and still not be flat and thus not match a third true flat reference. Thus:

    "When two plates are not flat but still match, one will not match the third. By continually lapping or scraping the high points of their contact until all three show perfect bearing when intercompared, three flat planes are created" [1]

    [1] Foundations of Mechanical Accuracy by Wayne R. Moore c. 1970

    (An absolutely fantastic read)
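
    To see why the third plate matters, here is a toy 1-D numpy simulation of the intercomparison. This is purely my own sketch, not from the book: the random height profiles, the fixed scraping "bite" per pass, and the pairing schedule are all made up for illustration. Each plate is an array of surface heights; mating flips one plate onto another, and scraping lowers whichever spots make contact.

```python
import numpy as np

def mate_and_scrape(a, b, bite=0.05):
    # Flip plate b face-down onto plate a. Contact happens where the
    # combined height a[i] + b[::-1][i] is within one "bite" of its
    # maximum; scraping removes material from both plates there.
    gap = a + b[::-1]
    high = gap > gap.max() - bite
    return a - bite * high, b - bite * high[::-1]

rng = np.random.default_rng(42)
# Three rough "plates", modeled as 1-D height profiles.
plates = [rng.normal(0.0, 1.0, 64) for _ in range(3)]

def worst_bearing(plates):
    # Distance from "perfect bearing when intercompared": the spread
    # of each mated pair's combined profile (0 = full contact).
    pairs = [(0, 1), (1, 2), (0, 2)]
    return max(np.ptp(plates[i] + plates[j][::-1]) for i, j in pairs)

before = worst_bearing(plates)
for _ in range(4000):
    for i, j in [(0, 1), (1, 2), (0, 2)]:
        plates[i], plates[j] = mate_and_scrape(plates[i], plates[j])
after = worst_bearing(plates)

print(f"worst pair spread: {before:.2f} -> {after:.2f}")
```

    With only two plates, the pair can settle into complementary curves (the stacked-Pringles case above) while showing full contact; cycling through all three pairings is what rules that out.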

  • dw_arthur a year ago

    There's something really satisfying about using a tool that makes you more productive, even more so if you built the tool.

    • fnordpiglet a year ago

      Stochastic Markov Monte Carlo cut and paste is clearly more powerful than deterministic cut and paste.

      • retrocryptid a year ago

        I think the GNU Emacs code base is proof of this statement. I'm fairly certain they fed the source into Dissociated Press and kept doing it until it compiled.

lovasoa a year ago

    GitHub Copilot X is currently a representation of GitHub’s vision for the future rather than an available product offering of GitHub Copilot. As we continue to design, test, and build features that fall into the GitHub Copilot X vision, we are also taking the time to determine the best way to provide them to our customers.
steve_adams_86 a year ago

I have no illusions that what I've been building was ever going to be commercial or even particularly useful (it just offers code improvements, bug fixes, etc. as errors are piped into the system – it's very rudimentary but neat), but it's striking how incredibly deep Microsoft's moat in this arena is right now.

I realized it would be best as a VS Code extension in its current form and started writing that this morning. I was thinking Github would do this eventually, it was obvious — copilot already existed and was primed for this — but evidently they'd already built it a while ago. I was also thinking eventually it could be a bot which generates pull requests based on error correction, but... They did that too.

If I wanted to go all in on this project, say as a hypothetical competitor to Copilot, I don't see how it would be possible to succeed. Any clever UI improvements would rapidly find their way into Copilot, offering far better solutions due to their absurd access to pertinent data.

They can even train Copilot to resolve errors based on how they've been resolved in the past; it could probably patch all kinds of common errors reliably soon.

I think this is cool in the sense that it'll be genuinely useful to people, but frightening to realize how dominant MS is here, in this moment. Maybe I'm not creative enough and I can't see a way to build something novel with this which they haven't already. That's arguably likely. At the same time... I'm a little shaken at how narrow of an opportunity anyone had to try to get into this space. I don't think it's a good thing.

BratishkaErik a year ago

Another reason to switch from GitHub, I don't want to strengthen this ecosystem. AI-generated description for PR? gg. IMHO it will drastically reduce real productivity for those who will remain on this platform (but still increasing whatever fake metric [accepted pull requests? stars?] people love to check)

  • sebzim4500 a year ago

    >IMHO it will drastically reduce real productivity for those who will remain on this platform

    Why do you believe this? Everyone I know who has used copilot has found it made them more productive. Admittedly, reports differ wildly on how much more productive from ~10% to ~100%.

    • SanderNL a year ago

      I have used it for 2 months and disabled it. So there is your exception.

      The code it suggests is always highly suspect and writing raw code never was the problem in the first place (for me). I was “discussing” with it for far longer than it was making me “productive”. I give it -5%.

      I do however love occasionally using GPT directly for converting some weird list of values to JSON or coming up with plausible test data. Sometimes some text or ideas for emails (especially in English, which is not my mother tongue). A secretary of sorts.

      • hellcow a year ago

        Copilot definitely makes me much less productive, since it breaks my flow on every line. I give it a -50%. It is both very dumb and very loud. It feels like I'm pair programming with a 1st year CS student who pipes up on _every single line_ trying to predict what I will type next and getting 95% of it wrong.

        I found ChatGPT, however, outputs good code when I want it to do simple things. Writing unit tests is tedious, and ChatGPT is pretty good at that. Optimizing a SQL query, etc. Things that used to take some time are now either instantaneous or get me 90% of the way there, and I can do the final edits.

    • mike_hearn a year ago

      Not everyone - me and quite a few others I know tried it and turned it off.

      It might be better now they've improved it, but for the sort of work I do (maintaining a mature Kotlin codebase) the prior version wasn't a productivity upgrade; it was a downgrade, because the type system and IDE generated more accurate suggestions that I didn't have to double-check for errors. Copilot and ChatGPT both seem to have error rates too high for this sort of work.

      I can see though, that once I switch to some other sort of work it might be more valuable.

    • BratishkaErik a year ago

      Sorry if I was unclear; by "real productivity" I meant something that requires more creativity than copy-pasting Stack Overflow/Copilot output into your code, not measured in lines of code but rather in how unique it is. Highly subjective, yes...

      • elil17 a year ago

        Right, but the idea is that copilot frees up more of your time for the creative part. That is what I have found in practice.

      • jwhiles a year ago

        What's the reason to think that it reduces the amount of creative code that's written? Doesn't it just let you get through the copy-pastable stuff faster? All things being equal, I'd assume it means you spend more time on the creative parts.

      • dagw a year ago

        Copilot lets me get the 'boring' scaffolding work out of the way quicker so I can spend more time on the parts of my project that actually are hard and 'unique'.

      • yunohn a year ago

        > rather in how much it is unique

        IRL a lot of what people do is rehashing or gluing together things as others have done before. We all stand on the shoulders of giants - code is a tool to enable an outcome.

        I don’t agree with your definition of “code uniqueness is productivity”.

  • blibble a year ago

    personally I cleared out all my highly starred projects and filled them all with randomly generated crap that happens to compile

    and will never contribute to public open source again

    (I guess MS have finally managed to kill open source)

  • _bohm a year ago

    Could you explain your reasoning? You expect that these tools will create more work to get from point A to point B rather than less?

    • marginalia_nu a year ago

      It wouldn't be unreasonable to expect just that. Overall, producing code quicker is probably not something we need. It's plenty quick to type code. What's slow is finding good designs. I think more often than not, we jump to the coding part too early and build things too soon. This creates problems that are hard to fix after the fact.

      The easier it is to produce code, the more code will be produced. The more code is produced, the more complex and short-sighted the architecture will be as a result.

      This is much older than AI. You can take a one-person task that takes two weeks to perform, assign it to a five person team, and they'll solve it by producing 25 times the code.

      We create abstractions to cope with the noise of a large code base, but in doing so, we also create a noisier and more complex code base that needs more abstractions.

      • _bohm a year ago

        Yeah I think there's a lot of sense in that. I think it's likely that the ability to use these tools in a disciplined fashion will grow to be a significant differentiator between more effective and less effective programmers. The former taking a considered approach to design and then using the tools where they're a real force multiplier e.g., writing unit tests, and the latter prompting them to spit out large swaths of code they would have previously written by hand: "write an endpoint that does X".

      • wnkrshm a year ago

        Managing complexity was once the job description

    • BratishkaErik a year ago

      In short: because there will be the same junk pull requests we see today (see the micro-changes to READMEs) but increased 100x. If you want to use AI, at least write the description yourself; otherwise there is no point in your PR, as the authors could make it themselves.

      • _bohm a year ago

        I see. My impression based on this press release is that GitHub is planning on marketing this more to teams using their paid plan though. It seems like this would be a non-issue for organizations using private repositories?

        • BratishkaErik a year ago

          > As we continue to design, test, and build features that fall into the GitHub Copilot X vision, we are also taking the time to determine the best way to provide them to our customers, which may include changes to Copilot for Business and Copilot for Individuals.

          so we'll see :)

  • mrcwinn a year ago

    The non-fake metric is the accepted pull request itself, i.e., work done in service of the objective.

fwlr a year ago

“developers can verbally give natural language prompts.”

One more reason to demand remote work. I really do not want to be sitting next to the guy who spends 8 hours talking to his computer.

  • nlh a year ago

    I think you may be missing the intended audience for this — it’s aimed at folks who are visually or otherwise impaired and cannot type using a traditional keyboard/input device.

    • fwlr a year ago

      Speaking to your computer is just as valid as typing into it! I have auditory processing issues, mentally processing speech takes up a significant fraction of my cognitive resources and I don’t really have the ability to just tune it out. Constantly hearing one half of a conversation would impair my ability to work almost as much as being forced to use a keyboard would impair the person who’s talking. I would be asking to work remote or private so that my needs aren’t in conflict with their needs. (Not that you could have guessed this was the case just from my comment, but that’s why I said “I really don’t want…” and not “People really shouldn’t be…”.)

      • ImaCake a year ago

        I use noise cancelling headphones for this. It's not perfect, but it helps a lot.

    • adzm a year ago

      The accessibility potential of AI is vastly understated. I expect this to fill that niche perfectly. The difference would be life changing for millions of people.

      • polyterative a year ago

        I suffer from chronic pain in my hands because I have been programming non-stop for the past five years. This really gives me hope that I can recover without losing my job.

w-m a year ago

What I'd love to see in the future is a proper refactoring UX. This new chat interface is probably a good start, as it seems able to replace code blocks in your original source file. But most code changes in the real world require changes in multiple locations. And I don't usually have questions for the AI, I have small tasks for it to do.

I hope we can get to a point where I can give quick commands to a Copilot-like system to do more complex changes. Changes which traditional refactoring tools balk at. Stuff you would discuss with other humans. Like "this value needs to be calculated once, move the code to the constructor", which then requires code being removed in one method, a member variable being created, and possibly a constructor, and some code being moved there, and perhaps a few changes in method parameter lists.
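The "move this to the constructor" example above touches several locations at once. A minimal before/after sketch in Python (all names here hypothetical) of what such a multi-point edit would have to produce:

```python
# Before: the value is recalculated on every call.
class Report:
    def __init__(self, rows):
        self.rows = rows

    def summary(self):
        total = sum(r["amount"] for r in self.rows)  # recomputed each time
        return f"total={total}"


# After the requested edit: the calculation is removed from the method,
# a member variable is introduced, and the work moves to the constructor.
class ReportOnce:
    def __init__(self, rows):
        self.rows = rows
        self._total = sum(r["amount"] for r in rows)  # calculated once

    def summary(self):
        return f"total={self._total}"
```

Even this toy case requires three coordinated edits (delete, add member, move code), which is exactly where traditional refactoring tools tend to balk.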

The jump from single-point edits to multi-point edits is obviously a huge one, not only for the AI part, but also for the UI (being able to understand, accept or modify the changes, for the human). That for me would be truly next level (up from the amazing place where we are already with Copilot), and I can't wait for it to get there.

  • salt4034 a year ago

    I agree. Before finalizing the changes, the AI could show you the diff and ask "is this what you had in mind or would you like me to change anything else?"

    • w-m a year ago

      Are diffs really the best we can do though? Can't we come up with a better model of interaction?

      I find that looking at code and looking at diffs are different mental modes, for lack of a better word. For me, parsing diffs seems to engage a different part in the brain, and is much more exhausting than reading code.

      In comparison, reading code while someone makes changes to it seems to stay in the same code-mode. And I don't want to switch back and forth from code-mode to diff-mode all the time.

      How about after being prompted, Copilot acts like a remote user in my code base, with its own cursor, changing the code at the speed that ChatGPT currently writes out answers? Perhaps even commenting on what it does next to it. You could still have accept/modify/reject buttons at the code change locations, but they would change the new version, not a diff.

      Getting this slightly wrong will of course make you curse the whole thing to hell and back.

  • ilaksh a year ago

    There are multiple startups for this.

    You can test the concept on my site https://aidev.codes although I don't have github integration yet.

    • w-m a year ago

      Looks neat! But I would need it for pushing numbers around with Python, not building websites.

      • ilaksh a year ago

        You can do that; it will output whatever you ask for. I just can't execute it or show any output yet. When I have the virtual servers I will be able to run the code.

RivieraKid a year ago

I admit that I'm anxious about the possibility that AI will drive down demand for developer jobs. My base case is that it won't have a meaningful negative impact, but it's possible that it will, so it's unpleasant to have this uncertainty about the future and to see the technology develop very rapidly with big news coming almost every day - which I would normally be ecstatic about.

I've been on a path to financial independence (= saving and investing) but still need ~9 years to be able to safely retire. I slightly envy people with US-based FAANG jobs, where achieving financial independence seems like a breeze.

This also adds fuel to my desire to start a side project.

What are your thoughts on this? Are you preparing in any way financially?

  • BarryMilo a year ago

    I'm thinking that over the medium to long term we'll lose jobs, but we won't see them disappear; they just won't have been created in the first place.

    • wfeefwfwe a year ago

      this is a very, very important nuance, i think.

drusepth a year ago

It's kind of mind-blowing to see how well Microsoft has navigated the Next Big Thing (AI) waters compared to Google, whose bread and butter is (was?) AI/ML.

Microsoft seems to have positioned themselves extremely well not only to be a leader in the space moving forward, but also to completely change the space. That's the mark of a great leader, and it's exciting to see someone taking the reins who's not afraid to shift a paradigm or two.

  • spaceman_2020 a year ago

    Bard is very underwhelming if you've used GPT-4.

    Between search being trash and now even Maps being awful, I think it's time to stick a fork into Google. They need new leadership and they need it asap.

Someone1234 a year ago

Just to save others from wasting their time (and free trial): The Copilot extension in full Visual Studio (i.e. NOT VS Code) is super terrible, broken, and buggy. It is so bad it is worse than nothing (literally, it can make Visual Studio lag out). But don't take my word for it, read the reviews:

https://marketplace.visualstudio.com/items?itemName=GitHub.c...

This is the polar opposite of VS Code where it works super well and reviews great accordingly. In my opinion if you're a Visual Studio developer who wants AI assist, you'll need to figure out how to open your solution in VS Code instead to use Copilot or just pay the $20/month for ChatGPT Plus then copy/paste.

PS - This isn't a version-specific thing; it has been broken since launch. They've been replying to reviews with "Try Copilot version [vNext] with Visual Studio [latest]" from last December through last week, yet the one-star reviews keep coming in unabated. Funnily enough, they recently replied to a complaint with "The VS Code version does this!"

  • squeegmeister a year ago

    I haven't noticed any issues with it, other than it occasionally just not chiming in. I'm using vs2022 on windows fwiw.

    That said, it wouldn't surprise me if vscode is getting preferential treatment

  • symlinkk a year ago

    Visual Studio is a legacy product. You should be switching to VSCode.

    • Someone1234 a year ago

      Tell that to Microsoft. .Net's VS Code support is a joke relative to Visual Studio (as opposed to TS which is better on VS Code by far). People aren't using Visual Studio (and spending $3K+/year) because they WANT to.

cess11 a year ago

Why participate in the hype?

It's highly likely this is the result of legal having mulled over the Copilot litigation and deciding that it can go either way or worse, so the top layer in MICROS~1 has decided to rush whatever product development they can use the general public for and try to hook as many people as they can to their stochastic parrots. Which they wish to insert into not just every other government machine, but also every Linux machine, preferably pumping out every line of code written with those.

It's better to support open source rights holders against this exploitation than getting one's nose brown.

  • joeyh a year ago

    To the extent of claiming that "GitHub Copilot is already writing 46% of code". It smacks of desperation to lie so baldly with statistics.

    • cess11 a year ago

      Yeah, it reeks; makes me think of the "it's afraid" scene in a well known movie.

      As for the productivity increase some claim it can bring, I'm not so sure. The software doesn't know your style guidelines until you give them away. Maybe it can help in some marginal cases, cough up some XML there is no longer official tooling for generating or whatever. If you don't have snippets and IDE support for boilerplate you're doing development wrong and should start looking into techniques for code generation in your stack.

      For any non-trivial application design you really want to know what your inspiration is and what it's used for, so you can judge the context of the code you take inspiration from and can cull the parts you don't need and adapt to your particular situation.

      And typically code output is more about learning the problem domain deeply and doing code reviews. If one spends more time writing stuff that doesn't need thinking than doing reviews and study there's an obvious need for automation and code generation, and one probably ought to think more about how to make one's contributions more efficiently.

pleb_nz a year ago

This is cool, no doubt, and I use it every day, but it still has shortcomings and isn't perfect.

However, I definitely see job cuts coming. Not now, in some years. I'm lucky, I'm closer to the tail of my career than the start.

But I feel for the hordes of young people who have gotten into this career but may soon find only a smaller percentage of them are needed.

You can't fight it; it's going to happen. It's up to individuals to identify this risk and take steps to ensure their bases are covered if and when it does.

  • stemlord a year ago

    I'm so mad. I love writing my own code so much. Life's a bitch

yanis_t a year ago

God, only yesterday there was a comment by someone, that Copilot will eventually move to gpt4. So eventually = just 1 day. Things move so fast these days.

naillo a year ago

Reminder that they read your code as part of this which then feeds their models. Basically you're paying to replace yourself long term.

nixpulvis a year ago

The first video in this post is a perfect example of the problems I see in this space.

First the programmer asks the AI to nebulously "fix the bug". Then the AI spits out an answer faster than anyone can read and prompts you to integrate it.

Sure I can pause the video... but this is terrible marketing and dangerous practice.

The PR autocomplete is just braindead development at its conclusion.

  • nixpulvis a year ago

    GitHub Copilot for CLI however, seems exciting.

    Ask a question for how to execute a command, get an answer with description of the options needed. Explicit and easy, cool.

    Still doesn't build the muscle/mind memory of finding it yourself, but could serve as a reasonable substitute for `tldr`.

erdaniels a year ago

How have people been thinking about licensing and attribution when it comes to using AI code assistance tools like this? In my personal work, I'm avoiding it since being trained on "publicly available code from GitHub" feels too risky for me if I were to say accidentally pull in GPL code to a private project.

  • dijit a year ago

    Someone asked me to pay for a team subscription for Copilot.

    As I'm CTO (and thus legally responsible for the agreements that people normally just click through), I read the terms of service, and they do not actually grant a license to the code. They say in very clear terms: YOU ARE RESPONSIBLE FOR TAKING SUGGESTIONS.

    So, as I don't want to encourage people to think that they can simply copy over Copilot solutions, I decided against buying access to it, even though I am quite certain it would have boosted developer productivity (and goodwill; people like shiny!).

    I was also a little concerned because an AI is sometimes confidently incorrect, so there are suggestions that carry subtle and hard-to-see bugs. I really don't want to hear that a bug was caused by AI, since from my perspective the whole point of you being paid a handsome salary comparable to a doctor's is that you take responsibility for the code you write; which is actually somewhat in line with the Copilot license agreement[0]

    [0]: https://github.com/customer-terms/github-copilot-product-spe...

    • AlexandrB a year ago

      There's also another risk, which is that copilot generated code may not be copyrightable[1]. I'm not sure how that would play out as part of a large codebase, but something to watch closely.

      [1] https://www.reuters.com/world/us/us-copyright-office-says-so...

      > The office reiterated Wednesday that copyright protection depends on the amount of human creativity involved, and that the most popular AI systems likely do not create copyrightable work.

      • anticensor a year ago

        In the UK, AI's operator gets copyright instead.

      • brookst a year ago

        Isn’t software copyrighted as a whole, not line-by-line?

        • belorn a year ago

          As the linked article describe it:

          "the office said copyright protection depends on whether AI's contributions are "the result of mechanical reproduction," such as in response to text prompts, or if they reflect the author's "own mental conception.""

          So if we take that into software development, the text input that the developer gave copilot may be protected under copyright but the output of copilot may not.

          If the developer arrange or modify the output, then those arrangements and modifications can also be protected under copyright.

          To me that means that during copyright cases there will be a much bigger burden on the plaintiff to prove that they own copyright in any specific situation, and that the infringement is done on those parts that is covered by copyright and not just the output of the copilot algorithm. Simply claiming authorship to all the code will no longer be enough.

          • brookst a year ago

            I don't think there's any precedent for copyright to be applied partially, with some of a work covered and some not. Are you proposing that copyright will change to be line-by-line, with each line covered (or not) depending on how it was generated?

            • belorn a year ago

              What the copyright office described was AI-generated images, and a book in which such images were included. Those images can't be copyrighted, but the prompt that produced them may be, and the arrangement inside the book may also be copyrighted. People can, however, take those images out of the book and reproduce them, since the images themselves are not covered under copyright.

              So if we apply that to software development, some portions of the code could be copyrightable while other portions will not, and the arrangement of the whole thing can be covered under copyright.

              So let's say you write a piece of software and I copy a portion of that code into my software. Is the portion that I copied covered under copyright? Maybe, maybe not. It will depend, and depending on how the legal precedents fall, it may be up to the author to prove that they really are the author.

              If I apply this to a fictional game developer, we could imagine them asking an AI to generate the image assets for a wooden table. Then they ask the AI to create the 3D mesh of a wooden table. Then they ask the AI to write a function that places the table with its image assets onto a 3D plane. Each step here would produce content that historically would be copyrightable independently, but which is not copyrightable if done through an AI. The big unanswered question is at what point the work becomes copyrightable, and what happens if someone takes assets and portions of that work and uses them in something else.

    • snickerbockers a year ago

      > They say in very clear terms: YOU ARE RESPONSIBLE FOR TAKING SUGGESTIONS.

      I really hope they get reamed in court over this. When Microsoft's servers running Microsoft's AI transmit somebody else's code covered under GPL (or any other FOSS license with attribution, like BSD) to a third party without replicating the license statement or even providing proper attribution, Microsoft has absolutely violated the GPL. They can't get around that by waving a paper that the third party signed.

      This reminds me of those dump trucks you see with stickers saying that the owner will not pay for damage caused by falling debris. Absolutely laughable attempt at dodging legal responsibility for an act that they are absolutely responsible for.

      I'm glad execs like you are taking this seriously instead of going along with M$' flagrant disregard for FOSS licenses. I hope there's a lot more corporate pushback for trying to sell a tool that can poison their customers' codebase with code stolen from copyleft projects.

      As an aside, I find it very telling that they trained this on other peoples' code instead of their flagship products like Windows, Office, VS, etc. I also remember seeing a few years ago an article on HN about M$ not allowing its employees to use github internally because they were worried about accidental IP leaks; I wonder if this is because they were already planning copilot?

    • coldtea a year ago

      >* I read the terms of service, and they do not actually grant license to the code. They say in very clear terms: YOU ARE RESPONSIBLE FOR TAKING SUGGESTIONS.*

      That doesn't exactly mean they don't grant licensing in the sense that they retain the license to use the code they produce.

      That means there's no licensing, and any issue others have with your code infringing on third party licenses is on you.

    • tubs a year ago

      Do you also disallow your employees from reading github, stack overflow, or other sources of code?

      • dijit a year ago

        I disallow copying of copyrighted code into our product repositories unless the license is clear. I have even reached out to people to directly license copyrighted software, or to relicense their public AGPL/GPL code (for a fee).

        To answer your statement directly:

        1) Stack Overflow submissions are CC-BY-SA, so if a person copies code from there, even if it's copyrighted you can consider it good faith and remove the offending code from your product. This is actually a licensing nightmare, but the current situation is the same as when the music industry accidentally uses an unlicensed sample, and there's no current reason to consider it otherwise.

        2) GitHub has a neat repository licensing feature (LICENSE), so you can easily tell which repo has what; by the way, not having a license file means the repository is fully copyrighted by default. If we bundled GPL code we would have to open source our game, which is not going to happen. So no, you're not allowed to just copy code, but you can take inspiration, which is what Copilot's terms say you should do: *NOT COPY CODE VERBATIM*.

        Ultimately: Software licensing is a joke to you because it doesn't really concern you, however it concerns me as it is quite literally part of my job to protect the company from being sued over doing something wrong here.

        • YetAnotherNick a year ago

          What if someone copied GPL code to Stack Overflow? It wouldn't even be hard to find examples of this.

          In situations like this, my opinion is to just assume that the license will bite the big guy first, so I am willing to play with fire if someone 10x bigger than me is doing the same.

        • tubs a year ago

          Right. So it's up to you to educate your developers to not copy code from sources that have licenses incompatible with your project. Much the same as educating the same developers that "YOU ARE RESPONSIBLE FOR TAKING SUGGESTIONS" from copilot. I do not see any difference here.

          It's an extremely personal attack to claim software licencing is a joke to me. You have zero context on the work I do professionally and as open source. I'd appreciate it if you could limit your arguments to not include personal insults.

      • xdennis a year ago

        Reading Stack Overflow doesn't send my code to Stack Overflow.

  • _bohm a year ago

    I have been using it on a fairly large project, and in practice I find it rarely, if ever, spits out anything other than something that resembles what I would have written next anyways, taking context clues from the file I'm editing.

    • packetlost a year ago

      Yeah, this is my experience too. You can prod it into generating code that exactly matches something from the training data, but it seems like you have to really try.

  • corobo a year ago

    I don't use it for dayjob stuff because I don't want to deal with the hassle of explaining and selling someone on it to get the use approved, and admittedly there are (however small) risks of sending confidential stuff over the wire. Copilot isn't enabled for those projects.

    Personal stuff? I don't really think about it at all outside of these threads, no. I'm not exactly citing every source for where I learned each concept, and people smarter than me say that's how Copilot learned it too. That'll do for me and my side projects, I'll keep up with the law as it keeps up with AI.

  • mnd999 a year ago

    I wouldn’t touch it with a barge pole for this reason.

  • danielvaughn a year ago

    I honestly don't think it's going to be a risk for developers. I don't believe you can accurately run attribution on any of the source code generated for you, right? If so, that should give you plausible deniability, while putting all the liability on Github's shoulders. It's their responsibility to make sure they don't violate licenses in their own model.

    • AlexandrB a year ago

      > It's their responsibility to make sure they don't violate licenses in their own model.

      Maybe morally that's true. But who do you think is easier to sue? Small startup using copilot or Microsoft?

      • danielvaughn a year ago

        Yeah true, you're right. Still, intuitively it seems like the better legal case would be against MS, even if it's not the more practical one.

  • netr0ute a year ago

    That's exactly why my projects are GPL, so I don't have to worry about that.

    • mnd999 a year ago

      Not all public code on GitHub is licensed in a GPL compatible way.

kypro a year ago

Slightly unrelated I suppose, but I'm not sure how I feel about a world where people frequently communicate with each other via AI generated prompts.

I can't quite put my finger on it, but there's something quite creepy about this future we're currently being sold. A world where inauthenticity is the norm, and where everything you read is no longer written by your family, friends and colleagues, but by their AI assistant.

I suppose people have expressed similar feelings about phones and video conferencing, and AI prompts are just the next step of optimising human interaction by removing all the constraints and annoyances that come with interacting with other humans.

Anyway, I would be interested in hearing how others feel about this. Or, if it's more convenient, how you feel as interpreted by your AI assistant. It's all the same, after all.

  • maicro a year ago

    All joking below aside, I'm not sure how I feel about this either, though I also haven't been following things too closely yet. The below is the first "real" use I've made of an AI to write a response for me.

    I guess there are a couple different layers there. On the one hand, the technology/medium being used has always caused us to adjust our communication style - we communicate differently if it's a letter that will take weeks to deliver and weeks more to receive a response, vs. a hasty limited text message, vs. an email, vs. a face to face conversation with no time constraints, etc. So I guess that as long as the person sending the message agrees that it conveys what they want, having an AI draft it isn't terrible...

    ---

    Me: Write a response agreeing to this message: [contents of your post, without correcting the original typo, thus explaining paragraph 4]

    Bard draft 3: I agree that there is something quite creepy about a world where people frequently communicate with each other via AI generated prompts. It seems like a world where authenticity is lost and where everything is just a simulation.

    I can see how people might be attracted to AI prompts because they are convenient and efficient. But I think there is also a risk that people will become too reliant on AI and that they will lose the ability to connect with each other on a deeper level.

    I think it is important to maintain a balance between using AI and interacting with other humans. AI can be a tool that helps us to communicate more effectively, but it should not be a replacement for human connection.

    I have not been interrupted by my AI assistant, but I can imagine that it would be annoying if it was constantly trying to interject or offer unsolicited advice. I think it is important for AI assistants to be respectful of human privacy and to only provide information when it is asked for.

    Overall, I think that AI has the potential to be a powerful tool for communication, but it is important to use it responsibly and to be aware of its limitations.

jacooper a year ago

Looks like anyone who left GitHub because of Copilot gained nothing: GPT-4, the new model used for the new Copilot, was trained on everything, not only GitHub code. So even if you moved your code to GitLab, Codeberg, or Sourcehut, it was still trained on...

That supreme court case is becoming more important by the day.

web3-is-a-scam a year ago

Oh nice, so now copilot can not only hallucinate what completions to use in the code, it can hallucinate what PRs even do and even hallucinate the documentation.

retrocryptid a year ago

But for the fact that it would likely get me permabanned, I would suggest making HackerNews Copilot X, the system that responds to HN posts and comments on your behalf and in a manner guaranteed to get more upvotes from other AIbots hanging out in internet comments forums.

  • retrocryptid a year ago

    I see that the AI bot has already down-voted this comment.

m_ke a year ago

People should keep in mind that these human in the loop AI systems are built to learn from the human until they’re good enough to replace them.

  • highwaylights a year ago

    Until they're good enough to liquidate them for fuel.

    FTFY.

  • ilaksh a year ago

    I am building my system https://aidev.codes and adding virtual servers to allow back-end code with the explicit goal of selling this service to end-users as an alternative to hiring software engineers. In the niche I have previously been working in, there is a huge demand to build fairly complex integrations (such as with my other service) without an adequate budget to hire a person.

    I already know at least one designer who previously would have considered hiring me for something but now has explained that they are achieving tasks with ChatGPT.

    Especially once you understand the reasoning ability of GPT-4 and what the 32k context window and the ability to understand images mean, any software engineer who thinks their job will remain safe is in complete denial.

    In fact, with this release you will start to see quite a lot of non-programmers using GitHub. Within X months or a few years, it's quite feasible that Microsoft will have a software engineer built into Windows.

  • jnovek a year ago

    Do you have evidence to back these claims? Specifically that AI will be replacing software engineers any time soon and that Copilot exists explicitly to gather data towards that purpose.

    • m_ke a year ago

      I’ve been working on ML systems like that for the last 10 years, usually as tools to automate boring data entry tasks like tagging photos, automating food logging, giving dietary advice.

      At first these tools offer suggestions, but as more data comes in they start to do parts of the work autonomously, until they can handle all cases.

      Stable diffusion will replace a ton of paid design work, self driving cars will at some point replace most drivers.

    • make3 a year ago

      Well it's hard to argue against the idea that the final aim of Copilot is to automate as much of software engineering as possible.

      I don't think their aim is explicitly to replace humans, but rather to be as helpful as possible to developers; the thing is, doing that accomplishes exactly the other thing.

eterps a year ago

I am guessing that at some point a class of programming languages will be designed that is both optimal to generate by LLMs and easy to read/understand for humans. Right now most PLs are optimized for writing code by humans.

  • AlexandrB a year ago

    > easy to read/understand for humans

    I think many languages are actually already optimized for this. Where "reading" means reasoning about what the program does and not just reading text.

  • gwoolhurme a year ago

    That is also what makes them optimal for LLMs as they are. It would be more efficient to have something that is direct machine code, but LLMs are Large <Language> Models, right?

  • brookst a year ago

    Do we even need such a language, beyond just English / other human languages?

    IMO we’ll see LLM interpreters / compilers where the spec is the code, and whatever the LLM does in the background is opaque to the “programmer”.

azemetre a year ago

It doesn't say it in the blog post, but signing up for the preview does require a paid subscription to Copilot.

Unsure if this applies to those who are given copilot for free (doesn't apply to me as I pay).

  • cristiioan a year ago

    I was able to apply with my student account (it has Copilot and Pro for free).

chrisan a year ago

Is there a way to talk to Copilot after something has been made? For example, I just asked it to search for something on a public API in a comment and let it auto-complete the function.

The resulting code was pretty decent except it included a hard coded access_token in the url. This was a blank project, I don't even have an access token for this api - guessing someone posted theirs in a public repo at some point.

Is there a way to talk to copilot at this point to refine the code?

  • muny a year ago

    What I normally do in cases like this is set up the necessary scaffolding for the solution. In your case, that could be introducing a variable named something like `access_token` for it to substitute into the URL.

    Then, when you trigger a suggestion, it's very likely that Copilot will use your `access_token` variable.
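
    A minimal sketch of that scaffolding pattern (the environment variable name and URL here are made up for illustration):

```python
# Sketch of the scaffolding idea: declare the variable you want the
# completion to use *before* triggering a suggestion, so it substitutes
# your placeholder instead of a memorized literal token.
import os

# Read the secret from the environment rather than hard-coding it.
access_token = os.environ.get("API_ACCESS_TOKEN", "")

# With `access_token` already in scope, a triggered completion will
# typically interpolate the variable rather than a leaked literal.
url = f"https://api.example.com/v1/search?q=foo&access_token={access_token}"
```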

  • drBonkers a year ago

    I think this is what Copilot Chat will do.

    • chrisan a year ago

      ah yes, you are correct. Visual Studio (code) only at the moment

pornel a year ago

The Copilot Voice dictation experience looks way better than the current state of the art of "type keyword import space quote foo quote semicolon newline".

nemo44x a year ago

Wow, Microsoft was so ahead of the curve when they bought Github. They saw this coming and knew they had to acquire them to make it into this kind of thing. Just absolutely impressive execution here. Especially since Github would not have been able to do this on its own. Just an absolute perfect acquisition here. They don't often go right, but when they do so much new value is created for everyone.

  • singularity2001 a year ago

    I am praying for some alternative platform to take the throne off Github. This is way too much power in the wrong hands. The competing product needs to add a fundamental new idea or feature. Gitlab as it is doesn't cut it.

itsaquicknote a year ago

Ouch, this nukes a few startups I was watching that were working on "basically this". What's the plan, control.dev and cursor.so?

c7DJTLrn a year ago

I can't believe how quickly things are moving. In five years we're either going to look back on this as a hysterical hype bubble, or as one of the biggest innovations of the 21st century.

I have been skeptical of Copilot from the beginning, but it's becoming clear that I'll be left in the dust if I don't at least try it.

UK-AL a year ago

I think they've done it backwards with regard to it writing tests. Tests are the check that keeps the AI in check. If the AI is writing tests, you have to double-check the tests.

You should write the tests, then the AI writes the code. It almost doesn't matter what the code is, as long as the AI can regenerate the code from the tests.

  • layer8 a year ago

    Tests don't (can't) prove that code is correct. They are merely a rough plausibility check that the code isn't completely wrong and didn't regress. You generally can't derive the right code just from tests.

    • UK-AL a year ago

      You can write tests about the properties you care about, which may not be everything. Generally, in some of the more financial applications I've written, I would be OK with people rewriting the app as long as it passes the tests.

      I've even written tests that say this set of inputs goes to this output, for various different subsets of inputs. Anything outside of the defined input sets fails validation.

      Then it randomly picks a couple of thousand inputs from the input sets I've defined and runs them. The more confidence you need, the more exhaustive a setting you put it on.

      It's a bit like QuickCheck.
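
      A stdlib-only sketch of that approach: define the valid input set, sample a couple of thousand inputs from it, and assert the properties you care about (the `fee` function and its valid range are hypothetical):

```python
import random

# Hypothetical implementation under test: a 2% fee in integer cents,
# defined only for amounts between 1 and 10_000 cents.
def fee(amount_cents: int) -> int:
    if not 1 <= amount_cents <= 10_000:
        raise ValueError("amount out of range")
    return amount_cents * 2 // 100

# QuickCheck-style check: randomly sample a couple of thousand inputs
# from the defined input set and assert the properties we care about.
random.seed(0)
for _ in range(2_000):
    amount = random.randint(1, 10_000)
    f = fee(amount)
    assert 0 <= f <= amount          # a fee never exceeds the amount
    assert f == amount * 2 // 100    # matches the spec exactly

# Anything outside the defined input set must fail validation.
try:
    fee(10_001)
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```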

      • layer8 a year ago

        You can approximate it, but to represent really all the properties, in the end the tests become a mirror image of the actual code you are testing, which then begs the question.

        A random sample of inputs that is hidden from the AI also won't allow it to derive a corresponding implementation. And if the set of sample inputs is not hidden, then the AI is still free to produce an implementation that only works for those sample inputs.

        • UK-AL a year ago

          You'd probably separate example tests and validation tests. Also, the test descriptions should be fed into the prompt to help guide it, like BDD-style tests.

          On test failure, the data is fed back into the prompt about what failed for another iteration.

          This will help avoid over-fitting by generating a new candidate on each test failure. I mean, you can't guarantee correctness, but you could probably get pretty close. Humans have the same problem.
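
          A rough sketch of that feedback loop, with a stub standing in for the model call (everything here is hypothetical; a real system would prompt an LLM with the BDD-style test descriptions plus the failure output from the last round):

```python
# Sketch of the generate/test/iterate loop described above. The model
# call is a stand-in stub; a real system would query an LLM instead.
def generate_code(prompt, feedback):
    # Hypothetical: return a candidate implementation as source text.
    # This stub "corrects" itself once it has seen failure feedback.
    if feedback is None:
        return "def add(a, b):\n    return a - b"   # first attempt, wrong
    return "def add(a, b):\n    return a + b"       # corrected attempt

def run_hidden_tests(source):
    # Validation tests are kept separate from the example tests the
    # model sees, to discourage over-fitting to known cases.
    ns = {}
    exec(source, ns)
    for a, b in [(1, 2), (10, -3), (0, 0)]:
        got = ns["add"](a, b)
        if got != a + b:
            return f"add({a}, {b}) returned {got}, expected {a + b}"
    return None  # all validation tests passed

feedback = None
for attempt in range(5):
    candidate = generate_code("Write add(a, b)", feedback)
    feedback = run_hidden_tests(candidate)
    if feedback is None:
        break  # candidate passes the validation tests
```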

Jack5500 a year ago

So wait, is the switch to GPT4 done now or will it be done with Copilot X? The wording seems unclear to me

the_other a year ago

It should be free, or at least significantly cheaper. It was trained on millions of person-days of work people shared for free. This is _our_ work being sold back to us.

(ok, not really mine, I don't contribute to OSS very much; I should probably pay for it.. but you get my point).

  • cal85 a year ago

    As an OSS contributor I'd love it if GitHub decided to share their profits with me, but I did knowingly contribute my work under free software licences so I don't see myself as having any moral or legal right to that. (If they've trained it on private repos that's another matter, I am assuming they haven't.) Either way, I don't think it's a strong argument to say it's our work being sold back to us. Whatever it is that makes people pay for Copilot, it can't just be the OSS code that it was trained on, as that is all still freely available online. People must be paying for the part that GitHub has trained/built, the part that reads your codebase and makes good suggestions.

criloz2 a year ago

Why people would upload their code to GitHub (if they are working on some state-of-the-art tech) is beyond me. You will be easily replaced by Microsoft, and they will say it was a product generated by GPT; see what happened in the Amazon store with the popular products.

noworriesnate a year ago

This will be a hard moat for GitLab to cross. There is so much value to having open source models though that I wonder if we could come up with a project on the scale of Linux that is for open source machine learning models. So many companies could benefit from it.

trashface a year ago

Funny how if AI puts programmers out of a job, they won't be able to afford all these AI coding tool subscriptions. I'm unemployed (and probably, now, unemployable) and I can rationalize/afford the $10/month for regular copilot, but not more.

visarga a year ago

> A ChatGPT-like experience in your editor with GitHub Copilot Chat: We are bringing a chat interface to the editor that’s focused on developer scenarios and natively integrates with VS Code and Visual Studio. This does far more than suggest code. GitHub Copilot Chat is not just a chat window. It recognizes what code a developer has typed, what error messages are shown, and it’s deeply embedded into the IDE. A developer can get in-depth analysis and explanations of what code blocks are intended to do, generate unit tests, and even get proposed fixes to bugs.

This is what I am excited about - seeing its errors and other things it needs in order to accomplish tasks. Can't wait to see it in action.

electrondood a year ago

> GitHub Copilot Chat builds upon the work that OpenAI and Microsoft have done with ChatGPT and the new Bing.

Cool, so it can sass me and threaten to stop responding when it doesn't like my tone?

Whatever PM decided that tools need to have a personality should be fired and blacklisted.

Toutouxc a year ago

Is anyone actually seeing things IRL like 46% of code written by Copilot? I'm using it in RubyMine on a few-years-old Rails codebase with some complex bits, and while I find it much more useful than RubyMine's already nice autocomplete, I can rarely get it to write more than a one-liner before it misunderstands my intentions and touches something it shouldn't.

One thing that I think contributes to this is that the codebase is really messy in places, so often there isn't much good code to parrot, and things that are named like they should only do thing A often do thing B on the side.

  • bottlepalm a year ago

    Probably more than 30% for me, and there's a lot of low-hanging fruit; I bet it could get up to 60% at least. This is a full-stack app. It knows from context what you're going to do next in many cases.

ComplexSystems a year ago

There has been this bug with Copilot in VSCode for just about forever where it only suggests one line at a time. People have been asking for this to be fixed for months and months: https://github.com/orgs/community/discussions/40522

Do the people here in the comments section, who seem to be getting good results with Copilot, somehow not have this issue? If so, what is the workaround?

devit a year ago

It's a bit annoying that you have to pay for Copilot just to sign up for the waitlist, rather than being able to either pay for acceptance or start paying when you are accepted.

aldarisbm a year ago

At which point do we not need to understand what is in front of us?

eg: The Code.

I was using chatgpt, with a groovy project (havent used groovy in years), I prompted chatgpt for an answer, and it gave me a solution. The solution (to me) was a bit obfuscated, but it worked. Since I'm only using groovy to figure out this one task, it would've been extremely easy for me to not check what the solution did, and just keep going with my day. Luckily, I'm curious and wanted to know how things worked.

What if we increasingly stop caring, because of the confidence instilled by the media?

"It can make errors" but also "This show SIGNIFICANT improvements over last generation (which tbh was pretty okay)"

A lot of dis/misinformation.

I like using Copilot/ChatGPT, but it's incredibly hard to believe that we won't just become 100% reliant on this tool, and coding will become something akin to what assembly code is to engineers today.

  • jnovek a year ago

    One hypothesis I have is that the future of programming is natural language and the “compiler” is an LLM. This would be no different from the days when C took over from asm.

    • layer8 a year ago

      The difference is that natural language doesn't have a well-defined semantics and is open to interpretation. Programming languages being compiled, on the other hand, have precise, well-defined semantics (excluding UB I guess) that you can formally reason about.

      Using an LLM for programming is not like compiling; it's like employing a developer: you either have to trust however they happen to interpret your instructions, or manually double-check their work.

      • jnovek a year ago

        I understand where you're coming from, but I didn't mean it quite that literally. More like, in the same way that compilers fundamentally changed how we program by adding a layer of (deterministic) abstraction, natural language assisted by AI could add a new (non-deterministic) layer of abstraction to programming.

    • NeuNeurosis a year ago

      This was my take on the tech. Ultimately it will need to be prompted. The form that prompt takes is open to a bunch of speculation, but even if it gets superhuman at producing code or any other output, it is directionless by itself, for now. We are what directs its ability, which I think is really cool. I think getting good at extracting exact outputs from the LLM will be the new engineering skill.

qwertox a year ago

> ChatGPT-like experience in your editor with GitHub Copilot Chat

The best thing about VS Code is how you can click on the tab of the current editor and drag it out into a new self-contained window, so that you can move it to another monitor. This is such an optimal use of screen real estate.

I will drag out Copilot from the main VS Code window to my right monitor, have the main editor in the center monitor, and Google / GitHub / API docs on the left monitor.

Yes, I'm being sarcastic.

  • awestroke a year ago

    What's your point?

    • magospietato a year ago

      VS Code constrains a single IDE instance to a single window. Extra flyouts like Copilot chat can only exist in this window, impacting the real estate available for the code we actually care about.

      It's Code's biggest weakness IMO.

    • qwertox a year ago

      Why not offer parts of it as a website? The part where I chat with it, where I ask my questions and get my answers. (Maybe it could bridge into VS Code via a local server to exchange code.)

      • awestroke a year ago

        There is already a website. It's on chat.openai.com

        • qwertox a year ago

          "GitHub Copilot chat is not just a chat window. It understands what code a developer has typed, what error messages are shown, and it's deeply integrated into ..."

          Could you please stop?

retinaros a year ago

Do you see this as a tool that increases productivity in a way that makes shareholders ask for leaner teams, or in a way that creates more demand for products and devs?

  • swader999 a year ago

    One should expect both.

    • dangerwill a year ago

      Increased productivity usually leads to cutting team size instead of expanding scope. A lot easier to justify doing the same work you have been doing with less, as opposed to doing new work with the same number of people.

      • anticensor a year ago

        Yes, possibly because it is riskier to get to more fields of work than do the same work with less effort.

    • digdugdirk a year ago

      But since the question was asked about shareholders specifically, the guaranteed bottom line improvement is going to come from cutting labor and running leaner teams. That's where large corporations will go first.

      • geraneum a year ago

        That's the main idea behind developing such systems. It's telling execs: hey, look, you are paying a lot of money to those pesky workers (i.e. developers), and we like that number; give us a cut and we'll save you a lot of money. This is nothing new, actually. It has happened over and over in other industries and is not really surprising. A lot of the added value that the developers were generating will be consolidated into the pockets of a few, and the rest will be automated. The businesses cannot be upset about this, because the ultimate goal of a business is not to be your friend but to generate more money, and one way to do that is to reduce costs.

bpodgursky a year ago

No JetBrains in the preview :(

  • gbuckingham89 a year ago

    This is a big disappointment

    • mewmew07 a year ago

      how is this big?

      it's a preview for new features, calm down

      • bpodgursky a year ago

        It's big because I want it now!

digdugdirk a year ago

With increases in productivity from systems like this, and a guaranteed bottom line improvement from cutting labor and running leaner teams, has anyone here heard of any large corporations using LLM's as an excuse for headcount reduction yet?

In all honesty, I wouldn't be surprised if this has already played a large behind the scenes role in the layoffs at Google/Microsoft.

  • shmatt a year ago

    I work at a big tech company and there is a complete ban on coding using LLMs. The fear is sending our code off for the model to learn from in the future. This is also why the hype around GPT-4-powered Office is overblown. Very rarely will a corporation be OK with GPT-4 running inferences on their internal documents and Excel sheets.

    Until an on-prem solution is offered, many of these companies won't get anything out of the technology

    The layoffs at big tech are caused by uneven profit generation, where 5% or 20% of the company generates 99.9% of the income. The rest are essentially startups being funded by rich VCs. Eventually, if you don't find PMF, the VC cuts the funding.

    • blackbear_ a year ago

      > Very rarely will a corporation be OK with GPT-4 running inferences on their internal documents and excel sheets.

      Which is why Microsoft will be able to charge outrageous fees for enterprise plans that keep employees' queries to GPT private and out of future training sets. I do not think on-prem will be successful, though; the hardware requirements are fairly substantial and specialized.

    • dougmwne a year ago

      How do you reconcile what you just said with the massive popularity of cloud services? The source code is already sitting in the same data center the LLM GPUs are sitting in.

      Of course there are some businesses too paranoid to use the cloud, but even intelligence agencies are on board these days.

      • xdennis a year ago

        Probably because people view the cloud as their data in someone else's hands, but AI companies view data as fair game for ingestion, since "they just learn like humans, and you allow humans to view your data, right?".

      • frenchy a year ago

        With cloud services, there's some expectation that your stuff is private. Copilot is like doing all your cloud development with an anonymous FTP server.

    • walthamstow a year ago

      > Very rarely will a corporation be OK with GPT-4 running inferences on their internal documents and excel sheets

      Aren't they already doing this, if your documents are in Sharepoint or Google Drive? I have assumed so, but with no basis

    • maeil a year ago

      For big companies, that on-prem solution is coming sooner rather than later. For small companies, plenty of them (likely the large majority) will be very willing to take the "risk" for the savings in labour costs.

  • jstx1 a year ago

    In general demand for programmers has been growing over the long term. So if the baseline is an upward trend, these tools can have a demand-reducing effect and the overall demand can still be going up just at a lesser rate.

    Or it could be flat or go down - my point is that one thing reducing demand doesn't necessarily mean that demand is going down.

  • manicennui a year ago

    If your job is nothing but writing the kind of boilerplate code that Copilot gets somewhat right, you probably should lose your job.

  • fnordpiglet a year ago

    Everyone I know is understaffed. I think the outcome is roadmaps deliver more reliably not reduce headcount.

    • moneywoes a year ago

      Seems strange with all the layoffs

      • zeroonetwothree a year ago

        Sometimes thinking that you are understaffed is actually caused by being overstaffed. The reason is that having a lot of people introduces tons of overhead and coordination costs that reduce the time available for your main work.

        • symlinkk a year ago

          Lol the cope here is unbelievable. Let me guess, the people that are “just overhead” and “reduces time for main work” are everyone else, not you, right?

      • fnordpiglet a year ago

        That’s companies preparing for the artificial recession the Fed is launching us into to ensure Twinkies don’t cost $0.05 more.

javier_e06 a year ago

I asked ChatGPT in plain English to cook up some bash scripts for me, and the examples are good. It replies in plain English and then generates working code that does what you ask it to do, and then you can ask it to refactor or to change and regenerate. Very impressive. Who remembers that ${my_string%%myword} shaves off the word at the end of the string anyway?
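
For reference, the `%` / `%%` (suffix) and `#` / `##` (prefix) expansions, which are exactly the kind of trivia this is good for:

```shell
# ${var%pattern} / ${var%%pattern} shave the shortest / longest match
# off the end; ${var#pattern} / ${var##pattern} do the same at the front.
file="archive.tar.gz"

echo "${file%%.*}"   # archive      (longest suffix match removed)
echo "${file%.*}"    # archive.tar  (shortest suffix match removed)
echo "${file##*.}"   # gz           (longest prefix match removed)
echo "${file#*.}"    # tar.gz       (shortest prefix match removed)
```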

unity1001 a year ago

I don't get it - has the copyright problem with AI-generated code trained on other people's repositories been solved yet?

endisneigh a year ago

it's interesting to try and reconcile the anti-big-tech zeitgeist with the pro big-tech AI offerings such as this. fun times

renewiltord a year ago

I currently use copilot in my terminal by using copilot.vim and then using C-x C-e in my command line to bring it into my editor but I have to exit and re-enter to get traditional autocomplete which I find annoying. Paths come easy since neovim will complete this for me with Deoplete but the rest is not straightforward.

noltncy a year ago

This is a good strategy. The source data, the tech to go through the data, the customers to sell the results back to: it’s all there.

Next up, AI in games through their own studios, activision Blizzard and the xbox platform.

Videos/video based content is something Microsoft doesn’t have. It’s mostly confined to ‘stream’ and teams.

symlinkk a year ago

Everything I have worked for is worthless. The programming knowledge I sacrificed my prime years to learn is obsolete. Everything I earned I used to buy a house which is declining in value day by day. I wonder if I would be happier and more successful if I didn’t take the responsible path in life.

hislaziness a year ago

Copilot with GPT-4 model, chat and voice, pull requests, command line, and docs to answer questions.

  • nico a year ago

    Are you doing that? Any specific tools/services you use for it?

yodon a year ago

After finally switching to a Mac full time last year for the better developer experience (bash command line without the WSL disk access slowness), I now feel like I need to install Parallels to get the better developer experience (Copilot and Copilot X under Windows)

kumarvvr a year ago

A curious question. Would it be possible to build LLMs off of bytecode / compiled code, and then translate it back to different programming language codes?

Is that a viable way to get more fine-tuned outputs from these models?

  • mad0 a year ago

    I don't think so; bytecode / machine code must be really precise, while LLMs produce nondeterministic outputs (at least GPT does). Add "hallucination" of outputs to the pile, and right now I think it's infeasible.

retrocryptid a year ago

So if MSFT's message here is "software engineers don't have to know how to code," what's the value of knowing how to code? Seems at odds with their core customers' self interests.

  • make3 a year ago

    it's totally not the message, weird that you would say this; this is a tool just to make the developers' lives easier

    • retrocryptid a year ago

      That doesn't seem (to me) to be how it's being advertised to corporate clients.

zzzzzhzzzzz a year ago

"At GitHub, our mission has always been to innovate ahead of the curve and give developers everything they need to be happier and more productive in a world powered by software."

Maybe start with repo folders?

dougmwne a year ago

Something interesting is that there is no mention of fine-tuning. Is GPT-4 so good at general tasks out of the box that fine-tuning is obsolete? Can one model really be equally good at Python and poetry?

  • og_kalu a year ago

    Fine-tuning has been more or less obsolete for LLMs since the introduction of GPT-3; this was a major discovery of the original paper. Codex-Davinci only existed because the original GPT-3 wasn't trained on much code, not because a version trained on sufficient code would need special code optimizations/fine-tuning.

    and yes, LLMs are general intelligences. there is nothing narrow about them at all

  • flangola7 a year ago

    >Is GPT-4 so good at general tasks out of the box that fine-tuning is obsolete? Can one model really be equally good at Python and poetry?

    This is correct. It is able to generalize across domains intelligently. Given the results of fine tuning in earlier models, fine tuning GPT-4 for a specific purpose will probably produce super-human ability.

mmaia a year ago

Huge impact for commercial open source projects. I'm thinking that it could even change incentives for companies to open source more projects.

ivxvm a year ago

So it still can't write new code according to project's patterns / conventions and "core" library?

DeathArrow a year ago

Microsoft missed some trains with mobile and servers. But they won with AI, developer tools, cloud and gaming.

ertucetin a year ago

It seems as though all the companies deliberately agreed to wait until 2023 to release their AI products

vslira a year ago

A bit sad that they’ll go the price discrimination route instead of improving Copilot

Gotta pay for those H100 I guess

throwaway4good a year ago

Voice support? Can I get this thing to call into a teams meeting? (Asking for a friend.)

zerop a year ago

So we will have to train our developers to produce code using GPT prompts.

I see many jobs are at risk.

amelius a year ago

Stackoverflow should be afraid.

  • erdaniels a year ago

    If you think about it, it's likely that much of the code Copilot is trained on comes from Stack Overflow Q&A that made its way into open-source GitHub projects.

bilsbie a year ago

So how do I get started with this. Explain like I’m a vim cgi-bin programmer.

ThereIsNoWorry a year ago

This does not replace anyone, it just gets rid of people that don't adapt. I bet any company and developer that won't be using AI augmentation to increase productivity (not just coding) will horribly underperform in less than 10 years.

yanis_t a year ago

On the negative side, no mentions of the Vim plugin update so far.

  • simonvc a year ago

    vim mode in vscode is good enough honestly. i've not booted vim for anything other than quick hacks in ages.

    99% of the time i `code` now instead of `vi`

johlits a year ago

Next step: Don't even show me the code. Just the results.

debugdog a year ago

Well, loving "tldr"[1] and smashing the tab key for my autocompletes, that Copilot CLI thing kind of sounds like another level of laziness and less Googling.

[1] https://github.com/tldr-pages/tldr

thoughtpeddler a year ago

I can't help but think that these new Copilot offerings, when fully implemented in 2-3 years or so, make project managers / program managers / release managers obsolete.

cpb a year ago

Better PR descriptions explain why.

messel a year ago

I've got LLM fatigue

eawlot3000 a year ago

the future is here. lets gooo

la64710 a year ago

This is clearly a case when an idea is oversold and everybody starts to blabber the same thing. Lose all the arguments to git cli because of an AI chat interface?