
How DevRel Teams Can Use AI Today and Tomorrow

It’s impossible to ignore the hype around AI right now, and all of us working in Developer Relations have been grappling with the right ways to leverage it. This year’s DevRelX Summit was largely focused on AI, and one of the panels I found most interesting was the one on How DevRel Teams Can Use AI.


Moderated by Ash Ryan Arnwine from Nylas, the panel featured insights from Kerri Shotts of Adobe, Jon Gottfried of Major League Hacking, and Joyce Lin of Postman. In this post, I want to dive into their thought-provoking conversation and explore some of the ways DevRel teams are currently using AI to enhance their work and products.


Whether you're a developer, a member of a DevRel team, or simply intrigued by the intersection of AI and software development, this discussion offers valuable perspectives on harnessing AI's potential today and envisioning its role in the future of DevRel.



How DevRel Teams Are Using AI Today


The first topic the panelists dug into was how DevRel teams are most impacted by AI today. Because all the panelists are building products and extensions specifically for developers (i.e., developer tools), all of them have been exploring ways to use AI to enhance their products. From direct integrations that enable autocomplete to custom-trained models that help developers explore documentation, AI is becoming part of almost every developer tool’s product portfolio.


On the other hand, the panel also discussed ways that they’re using AI to help their developer relations teams do their jobs more efficiently. While there are limitations to large language models (something I’ve noted before), they can supplement the work that DevRel teams are doing in some interesting ways. Let’s dive into some of the discussion on both of these AI use cases for DevRel teams.


Question 1: How are you seeing DevRel teams use AI already?


The first point brought up by Jon Gottfried at Major League Hacking is that AI is changing the way developers write and debug their code:


“One of the things we’ve seen is AI replacing some aspects of pair programming and one-on-one support. We see a lot of developers posing questions to ChatGPT, which is interesting.”

One of the biggest limitations of this use case is that AI models aren’t always correct. Often the training data is out of date, or there is so little information available on a topic that the model simply makes up answers rather than admitting it isn’t certain.


While inaccurate answers might be okay for more experienced developers who know to question the model and interweave their own knowledge, giving less experienced developers incorrect but confident answers to their questions is downright dangerous at times.


Kerri Shotts of Adobe pointed out that if the AI can’t give accurate answers, users have to be familiar enough with the API docs to evaluate its responses for correctness themselves. Obviously, this is frustrating.


With this use case in mind, DevRel teams are tuning or training models on their docs to ensure that the answers users get from AI chatbots are as accurate and up-to-date as possible. This helps overcome the correctness problem, but it still doesn’t make AI a perfect tool.
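The panelists didn’t get into implementation details, but the most common pattern for doc-grounded chatbots right now is retrieval augmentation: embed your documentation, pull the passages most relevant to a question, and hand them to the model as context. Here’s a minimal sketch using OpenAI’s Python SDK; the doc snippets, model names, and prompt wording are my own placeholders, not anything from the panel:

```python
# Minimal retrieval-augmented docs Q&A sketch (illustrative only).
# Assumes the `openai` Python package (v1+) and an OPENAI_API_KEY in the environment.
import numpy as np
from openai import OpenAI

client = OpenAI()

# In a real system these would be chunks of your actual documentation.
DOC_CHUNKS = [
    "To authenticate, pass your API key in the Authorization header as a Bearer token.",
    "The /v1/messages endpoint returns at most 100 results per page; use the cursor param to paginate.",
    "Webhooks are retried up to 5 times with exponential backoff.",
]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed(DOC_CHUNKS)

def answer(question: str) -> str:
    # Rank doc chunks by cosine similarity to the question.
    q_vec = embed([question])[0]
    scores = doc_vectors @ q_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    top_chunks = [DOC_CHUNKS[i] for i in scores.argsort()[::-1][:2]]

    # Ground the model in the retrieved docs and tell it to admit uncertainty.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": "Answer using ONLY the provided documentation. "
                "If the docs don't cover the question, say you don't know.\n\n"
                + "\n".join(top_chunks),
            },
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(answer("How do I paginate results?"))
```

The important detail is the system prompt that tells the model to answer only from the retrieved docs and to admit when they don’t cover the question, which directly targets the made-up-answers problem described above.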


Another important factor Kerri pointed out is that context matters. A person typing a question into ChatGPT may expect different answers than a developer writing code in their IDE or terminal would. By building AI into specific product experiences, you can give the model far more context than a purely general model has.
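As a quick illustration of what “more context” can mean in practice (my example, not something from the panel), compare what a general chatbot sees with what an embedded integration could assemble from the user’s environment; the field names and values below are hypothetical:

```python
# Illustrative only: the extra context an embedded integration can attach to a prompt.
question = "Why am I getting a 401 from the messages endpoint?"

# A general chatbot only sees the question itself.
general_prompt = question

# A product integration can bundle in environment details it already knows about.
ide_context = {
    "language": "python",
    "sdk_version": "nylas==6.2.0",      # hypothetical version string
    "open_file": "sync_messages.py",    # hypothetical file name
    "recent_error": "HTTP 401: invalid grant",
}
embedded_prompt = (
    "You are assisting inside the user's IDE.\n"
    + "\n".join(f"{key}: {value}" for key, value in ide_context.items())
    + f"\n\nQuestion: {question}"
)

print(embedded_prompt)
```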


Finally, Ash pointed out that developers are also using AI as a discovery tool, and that DevRels (especially those with KPIs around growth) should be aware of this. He noted that a developer came to Nylas through a ChatGPT recommendation, and that while this is good, it’s also not a channel they have much control over yet.


Question 2: How are you seeing developers use AI in conjunction with your API?


With his unique experience running hackathons, Jon noted that he sees a lot of developers using AI to figure out how they can use a particular tool or leverage its full capabilities.


This really resonated with me. I always hated spending hours reading through a tool’s entire documentation site just to figure out what it was really capable of, but a well-trained model could act as instant support for developers in that situation. This frees DevRels from having to answer all these questions manually, and it means users can get real-time support without expensive full-time staff on call at all hours.


Ash pointed out that building custom models used to be the only way to accomplish this, but now GPT-based tools can be built using OpenAI’s API, making the cost to develop and maintain custom models much lower.


Joyce’s perspective was also interesting, as Postman is one of the most widely used API management tools out there. While they have AI-based documentation tools, she pointed out that, in general, the AI tools available to developers are more mature than those made for DevRel use cases:

“I would love for it to be able to help me write a conference talk faster, but the human aspect is still not quite there.”

This mirrors my experience too. Last year, I tried out several of the top AI writing tools, and found that none of them could really replace a few years of personal experience when it comes to technical depth.


This response served as a nice segue into the next question…


Question 3: How are you using AI models to specifically help DevRels do their jobs?


Right off the bat, everyone agreed that AI isn’t coming to replace DevRels anytime soon. Ash and Kerri pointed out that you have to use your own experience and judgment to filter the information LLMs generate, and you should still double-check everything they give you back.


Joyce added that cutting-edge or niche topics are especially problematic for off-the-shelf AI models because LLMs can only draw from existing content. And because keeping documentation and blog posts up to date is already a huge challenge for DevRel teams with older products, the current state of generative AI is only minimally useful for many of them.


That said, it can help you spot things you might be missing or get started on something new.


The “blank slate” problem was brought up a couple times in this discussion, and that’s where I’ve found generative AI to be most helpful too. I wrote the first draft of the introductory paragraph for this blog post using ChatGPT, and then revised it heavily to make it sound more like my writing. Despite writing thousands of blog posts in my career, I still leverage AI sometimes to help me get a new document started.


Question 4: How are you leveraging AI for onboarding?


As use cases were discussed, Ash brought up one especially tricky problem that they’re working on at Nylas: new user onboarding.


In theory, AI could build bespoke onboarding flows based on the developer’s experience level, use case, company size, stack, etc. In practice, it’s hard to give the AI enough of that information for the results to be useful.


Kerri pointed out that skill level matters a lot, both in onboarding and in answering common questions. A newer developer might get an incorrect code sample from AI and have no idea how to start debugging it, while a senior developer might not need as much boilerplate code to work from. The worst-case scenario is that AI slows down onboarding or frustrates new users. You could try to gather this information through a form, but then you have to worry about getting the wrong information or burdening users too much early on.
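To make the idea concrete, here’s one hypothetical way a developer profile (however it’s collected) could shape onboarding guidance. The profile fields and instructions are invented for illustration, not anything Nylas or the panelists described:

```python
# Hypothetical sketch: tailoring onboarding guidance to a developer profile.
# The profile fields and prompt wording are invented for illustration.
from dataclasses import dataclass

@dataclass
class DeveloperProfile:
    experience: str   # "beginner" | "intermediate" | "senior"
    use_case: str     # e.g. "email sync", "calendar scheduling"
    stack: str        # e.g. "Python/Django", "Node/Express"

def onboarding_prompt(profile: DeveloperProfile) -> str:
    # Adjust how much hand-holding the generated quickstart should include.
    depth = {
        "beginner": "Explain each step and include a complete, runnable example.",
        "intermediate": "Keep explanations brief and link to reference docs.",
        "senior": "Skip the basics; show only the API calls and edge cases.",
    }[profile.experience]
    return (
        f"Create a quickstart for a {profile.experience} developer using {profile.stack} "
        f"to build {profile.use_case}. {depth}"
    )

print(onboarding_prompt(DeveloperProfile("beginner", "email sync", "Python/Django")))
```

The hard part, as the panel noted, is getting an accurate profile in the first place without adding friction.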


Retaining context seems to be part of the solution, though. Joyce pointed out that GitHub’s Copilot X retains its knowledge about the user and their interactions with the IDE to help it give more personalized responses. Ash noted that a transferable configuration (like a JSLint file) for any AI coding app would be an interesting solution too.
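No such standard exists today, but you could imagine Ash’s idea looking something like this: a small file checked into the repo, analogous to a linter config, that any AI coding assistant could read and fold into its instructions. The filename and fields below are entirely hypothetical:

```python
# Entirely hypothetical: loading a portable "AI assistant" config from the repo
# (analogous to a linter config) and turning it into system-prompt instructions.
import json
from pathlib import Path

DEFAULTS = {
    "language_level": "python>=3.10",
    "style": "type-hinted, black-formatted",
    "avoid": ["global state", "bare except"],
    "test_framework": "pytest",
}

def load_assistant_config(repo_root: str = ".") -> dict:
    path = Path(repo_root) / ".ai-assistant.json"   # made-up filename
    if path.exists():
        # Values in the repo's config override the defaults.
        return {**DEFAULTS, **json.loads(path.read_text())}
    return DEFAULTS

def as_system_prompt(config: dict) -> str:
    avoid = ", ".join(config["avoid"])
    return (
        f"Target {config['language_level']}. Write {config['style']} code. "
        f"Avoid {avoid}. Use {config['test_framework']} for tests."
    )

print(as_system_prompt(load_assistant_config()))
```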


Question 5: What will AI integrations and tools look like in the future?

Ash pointed out that throughout this topic, we’ve been talking about AI in the form of chatbots and text autocomplete, but these aren’t the only forms the technology could take. For example, what would it look like to have CLI-based AI tools that returned non-deterministic results based on their understanding of context or the user’s network configuration?


Jon added that we’re starting to see AI used less transparently in things like dynamically generated page content. For example, a tool’s homepage could generate different text based on who’s viewing it or how they’ve interacted with the site before.
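Jon didn’t describe an implementation, but a bare-bones version might map whatever signals you already have about a visitor into a prompt and let a model rewrite the page copy per segment. The base headline and visitor fields here are invented:

```python
# Illustrative sketch of per-visitor page copy; the headline and visitor fields are invented.
from openai import OpenAI

client = OpenAI()

BASE_HEADLINE = "Ship email, calendar, and contacts features in days, not months."

def personalized_headline(visitor: dict) -> str:
    # Ask the model to adapt the standard headline to what we know about this visitor.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": (
                f"Rewrite this headline for a visitor who works in {visitor['language']} "
                f"and previously read our docs on {visitor['last_docs_page']}:\n{BASE_HEADLINE}"
            ),
        }],
    )
    return resp.choices[0].message.content

print(personalized_headline({"language": "TypeScript", "last_docs_page": "webhooks"}))
```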


He also pointed out that more programmatic use cases will help alleviate some of the back-and-forth currently necessary. He mentioned a use case where he’s building a tool that will look at a code sample and error message, recommend a fix, and then create a GitHub issue with the fix suggestion. Currently, this workflow would be pretty manual, but it could be automated or built into other tools in the future.
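Jon didn’t share code, but the rough shape of that pipeline is easy to sketch: send the snippet and error to a model for a suggested fix, then open an issue through GitHub’s REST API. The repo name, sample snippet, and token handling below are placeholders:

```python
# Rough sketch of the described workflow: code + error in, suggested fix out,
# filed as a GitHub issue. Repo, token, model, and sample inputs are placeholders.
import os

import requests
from openai import OpenAI

client = OpenAI()

def suggest_fix(code: str, error: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"This code:\n{code}\n\nfails with:\n{error}\n\nSuggest a fix.",
        }],
    )
    return resp.choices[0].message.content

def file_issue(repo: str, title: str, body: str) -> int:
    # GitHub REST API: POST /repos/{owner}/{repo}/issues
    resp = requests.post(
        f"https://api.github.com/repos/{repo}/issues",
        headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"},
        json={"title": title, "body": body},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["number"]

if __name__ == "__main__":
    code = "resp = client.messages.list(limit=500)"   # example snippet
    error = "HTTP 400: limit must be <= 100"           # example error message
    fix = suggest_fix(code, error)
    issue_number = file_issue("example-org/example-repo", "Suggested fix for 400 error", fix)
    print(f"Opened issue #{issue_number}")
```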


Kerri pointed to two ways that developer tools will use AI in the future. First, they’ll help guide users to the right solution or point out mistakes proactively. This could be very helpful if you’re new to a tool or working in a new stack. Second, AI could be used to suggest “happy paths” in onboarding, helping new users see what else they can do with the tool and flagging any risky patterns they employ.


Conclusion


While the AI hype cycle has probably peaked, we’re still in the early days for generative AI technologies. Joyce pointed out that similar to buzzwords like “low code,” chatbots have promised a lot, but we haven’t seen the end of what they can offer.


AI has the potential both to open up coding to non-developers and to help users who “don’t know what to ask,” because it can decipher fuzzy context much better than traditional structured queries can.


Finally, DevRel will undoubtedly find new ways to leverage generative AI as the technology matures. Whether it’s helping us update or generate documentation, create technical tutorials in new programming languages, or respond to users faster and with more correct information, we’re just scratching the surface of AI and developer relations in 2023.


About Karl Hughes

Karl is a former software engineer and CTO. He’s currently the Founder and CEO of Draft.dev, where his team has helped 150+ developer tools companies create compelling, technical content at scale. You can reach out to him on Twitter/X or via email.
