Continue Newsletter: Monthly Updates

April 2024 Updates

Continue Updates

🔴 Significant improvements to cmd/ctrl+I in VS Code: better prompt for all models, easier to undo changes, accept/reject one-by-one, cleaner diffs, and more

🟡 Personal usage analytics! Continue now locally tracks the tokens you generate each day, so you can view the breakdown per day and per model.

🟢 Our new @code context provider makes it easy to reference functions and classes

🔵 Tab-autocomplete is now available in JetBrains! Let us know what you think on Discord

⚫ Continue community members contributed a GitLab MR context provider, a “local variables” context provider, v2 support for the Jira context provider, and more in March

Code x AI Updates

🐞 Gradle posted two talks from the 2023 DPE Summit in March: “AI + Engineering = Magic at Airbnb” and “From Myth to Legend: How Generative AI can Supercharge Productivity to Create 10x Developers at Uber”

🐝 Justin Milner, a Continue community member, wrote A Guide To The Booming Landscape Of Coding Assistants that provides an overview of AI software development systems

🐊 Ty spoke at Data Council last week about how to use your Continue development data to improve your AI dev system (slides)

Upcoming Events

1️⃣ Continue weekly office hours will still be on Wednesdays at 10am PST in April

2️⃣ We will be hosting an open-source AI meetup with Ollama in San Francisco in April

3️⃣ We will be attending Heavybit's DevGuild: AI Summit II on April 23rd in San Francisco

Did you know?

If you are using a new version of a library or framework, you might run into issues with LLM knowledge cutoffs. You can get around this by using the "docs" context provider to add information from any documentation site as context automatically
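To give a rough idea of how this looks in practice, here is a minimal config.json sketch for pointing the "docs" context provider at a documentation site. The exact schema and field names may differ by Continue version, and the URL shown is just an illustrative placeholder:

```json
{
  "docs": [
    {
      "title": "My Framework Docs",
      "startUrl": "https://example.com/docs/"
    }
  ]
}
```

Once indexed, typing @docs in the chat input lets you pull pages from that site into the LLM's context.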

March 2024 Updates

Continue Updates

❤️ Our open-source, local tab autocomplete made it into beta in February

🌊 Lots of new models to try like Gemma 7B, Mistral Large, StarCoder2, and more

🌲 @rootedbox and @NinjaPerson24119 built Postgres / database context providers

@theBenForce contributed another awesome context provider for Jira issues

🚒 The JetBrains extension shares most of its code with the VS Code extension again

Code x AI Updates

🧿 I worry our Copilot is leaving some passengers behind is an insightful perspective on the very real UX and accessibility impact of using LLM tools while coding

🌱 How can you improve the code suggestions you get from LLMs? dives into what we might do when we find ourselves getting suggestions that are flawed or wrong

Upcoming Events

1️⃣ Continue weekly office hours will still be on Wednesdays at 10am PST in March

2️⃣ The next local and open-source AI meetup will be in Paris on Thursday, March 21st

3️⃣ Ty will be speaking about development data at Data Council in Austin on Mar 28th

Did you know?

We released tab-autocomplete in beta yesterday. Nate summarized how it was built here

February 2024 Updates

Continue Updates

🐋 New context providers: @directory uses RAG to find relevant files, @problems shows the LLM any lint warnings in the open file, and @docs crawls and indexes docs pages

🚌 Local codebase embeddings are back with support for multiple providers of embeddings, including transformers.js, OpenAI, Ollama, and Together embeddings

🍎 Improved codebase indexing now stays up-to-date continuously, even when you change branches, and supports Remote SSH and Dev Containers

🪴 Image support! Continue now lets you drag and drop images into the chat when using a compatible model (e.g. gpt-4-vision-preview or llava)
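As a rough illustration of the embeddings support mentioned above, here is a config.json sketch for selecting an embeddings provider. The field names reflect Continue's configuration at the time but may vary by version, and the model name is just one example of an Ollama embedding model:

```json
{
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```

Swapping the provider (e.g. to "openai" or "transformers.js") changes which service computes the codebase embeddings used for retrieval.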

Code x AI Updates

✈️ The State of Developer Ecosystem 2023: Artificial Intelligence by JetBrains

🦙 Meta AI released Code Llama 70B this week. You can try it out in Continue with Ollama, Together, Perplexity, and other providers now

Upcoming Events

1️⃣ Continue weekly office hours will be on Wednesdays at 10am PST in February

2️⃣ We will be co-hosting another open-source AI developers meetup in SF on Feb 12th

3️⃣ Ty will be speaking about development data at Data Council in Austin, TX on Mar 28th

Did you know?

We released an experimental version of tab-autocomplete that uses deepseek-coder:1.3b-base on Ollama by default. The goal is to make Continue's autocomplete entirely configurable. Nate plans to tweet about his process of building it here 👀

January 2024 Updates

Continue Updates

🐍 No more Continue Python server, so the extension now works immediately, every time

🚒 cmd/ctrl+shift+l enables you to make quick edits without opening the sidebar in VS Code (i.e. edits are streamed in line by line without a side-by-side diff)

〽️ Inline context providers allow files, the terminal, and everything else to be referenced naturally as you type

🌊 Three new model providers: DeepInfra, Mistral API, and Google AI Studio

Code x AI Updates

💻 LLMs and Programming in the first days of 2024 is a reflection by Salvatore Sanfilippo, the creator of the Redis database, on how he uses LLMs while coding

💡 You Can Build an App in 60 Minutes with ChatGPT - Ep. 5 with Geoffrey Litt shows how to use large language models as more of a muse than an oracle when programming

Upcoming Events

1️⃣ We will be co-hosting another local & open-source AI developer meetup in San Francisco with Ollama, Replicate, Chroma, and more folks on Tuesday, January 23rd

2️⃣ Continue weekly office hours will be on Wednesdays at 10am PST in January

Did you know?

cmd+m (Mac) / ctrl+m (Windows) starts a new session when you change the topic

December 2023 Updates

Continue Updates

🐥 The model and provider setup experience has been simplified with config.json

📗 Use /so to automatically include Stack Overflow search results in the LLM’s context

🦙 Run local, open-source models easily using llamafile as your provider with Continue

⭐ The Continue repository crossed 5,000 stars on GitHub this past month

🍎 Why we are building Continue: to make building software feel like making music
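To sketch what the simplified config.json setup mentioned above might look like, here is a minimal example adding a model and provider. The schema shown is illustrative and may differ from your Continue version; the title and API key are placeholders:

```json
{
  "models": [
    {
      "title": "GPT-4",
      "provider": "openai",
      "model": "gpt-4",
      "apiKey": "YOUR_API_KEY"
    }
  ]
}
```

Multiple entries can be listed under "models", and the model dropdown in the sidebar lets you switch between them.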

Code x AI Updates

🐍 LLMs are helpful with Python, but what about all of the other programming languages? attempts to estimate how helpful LLMs are with different languages

📬 What are the accuracy limits of codebase retrieval? dives into different approaches to automatically determining the most important context from your codebase

🤗 Developers are excited about the coding capabilities of DeepSeek Coder (1.3B, 6.7B, 33B) and DeepSeek LLM (67B), which are open source and dropped this past month

Upcoming Events

1️⃣ We will be attending the local & open-source AI developer meetup tomorrow

2️⃣ Continue weekly office hours will be on Wednesdays at 10 am PST in December

3️⃣ The next AI developer tools meetup will be on Thu, Jan 25th, 2024 in San Francisco

Did you know?

You can use the /so slash command to include Stack Overflow answers as context

November 2023 Updates

Continue Updates

🧇 The JetBrains extension was officially released to the store—it is still going through alpha testing, but it has already reached full feature parity with VS Code

🔍 Codebase retrieval is now available in the VS Code pre-release, which allows you to cmd+enter to “talk to your codebase” using vector embeddings and other methods

🏕️ To improve stability before official releases, there are now pre-release versions of the VS Code extension as well as platform-specific extension packages

🖍️ We've added file icons in the sidebar and a more intuitive way of viewing selected context items: toggle to view them inline or click to open the file directly

⚡️ To make the UI more reliable, there was a major refactor that makes it more snappy, less clunky, and eliminates having to wait too long for “Starting Continue Server”

DevAI Updates

🧰 What LLM to use? A perspective from the DevAI space walks you through the most popular commercial and open-source models that are being used while coding

🪼 Developers are the first group to adopt AI at work. Here’s why that matters provides some insight into how software development is already evolving due to LLMs

🌲 How to use a Large Language Model while coding shares some ideas for how to think about using LLMs while coding and tells you about the common mistakes to avoid

☀️ Large Language Models for Software Engineering: Survey and Open Problems gives an overview of the state of research on code LLMs as of October 2023

🌶️ Awesome-DevAI is a curated list of resources about DevAI—the community of developers building software with the help of LLMs—that can help you get up to speed

Upcoming Events

1️⃣ Office hours at 10am PT on Saturdays will continue on the Continue Discord in Nov

2️⃣ Ty will be in Berlin from Nov 13th to 17th. Reach out to meet up and discuss DevAI!

3️⃣ The last DevAI meetup in San Francisco this year will be on Nov 30th. RSVP here : )

Did you know?

When you want Continue to start fresh with new context, you can press cmd+option+n in VS Code or cmd+ctrl+n in JetBrains to begin a new session

October 2023 Updates

Continue Updates

🐳 Codebase embeddings let you ask questions about your workspace (experimental)

💫 We started alpha testing the JetBrains extension with early users this past week

🚒 Use /cmd to generate a command and have it automatically placed in your terminal

🐍 We integrated w/ the Language Server Protocol in Python (more languages soon!)

🌊 New LLM providers: LM Studio, HuggingFace TGI, and the Google PaLM API

👑 Make adjustments using cmd+shift+m w/ follow-up edits after a diff is generated

DevAI Updates

🪂 Mistral 7B, which "approaches CodeLlama 7B performance on code, while remaining good at English tasks", was released on September 27th

🌳 Imbue announced $200 million in new funding to "build practical AI agents that can accomplish larger goals" by developing models that can code and “robustly reason”

💎 83% of respondents to GitLab's state of AI in software development survey said that implementing AI in their software development is essential

Upcoming Events

1️⃣ Office hours at 10am PT on Saturdays will continue on Discord in October

2️⃣ We will be attending Heavybit's DevGuild: AI Summit on October 19th

3️⃣ There will be another in-person meetup in San Francisco on October 26th

Did you know?

You can add not just one but multiple sections of code as context by highlighting each one and pressing cmd+m (MacOS) / ctrl+m (Windows)

September 2023 Updates

What's new?

🦙 Code Llama from Meta AI dropped on August 24th! Many of you began using it through Continue via Ollama, Replicate, and Together, and posts about us ended up on the front page of Hacker News that day as well as the next week.

✏️ There are many new opportunities for customizing Continue, including custom chat templates, the ability to use separate models for edit/chat, a handful of new model providers, and a GUI that allows you to quickly swap models, set the system message, or change the temperature

🔁 Tommy Ubongabasi, a full-stack developer from Uyo, Nigeria and Continue power user, wrote about how he uses Continue to accelerate his workflows.

📜 There are lots of new features and improvements to the Continue GUI, allowing you to browse session history (i.e. click the file icon at the bottom to browse and resume past Continue sessions), save groups of context for later with bookmarks, easily toggle between models with the new dropdown, edit past messages, and navigate entirely with keyboard shortcuts (e.g. option+cmd+m to toggle Continue, cmd+shift+m to edit, etc).

🧑‍💻 @bra1nDump delivered huge improvements to the contributor experience by shipping a two-click setup after cloning the repository, the ability to debug from a single window, and an initial foundation for unit testing.

🖥 There are four new context providers: ‘@url’ (reference documentation pages), ‘@diff’ (lets you show the LLM all the changes you’ve made), ‘@search’ (brings the exact same search engine used by VS Code to Continue), and ‘@terminal’ (reference tracebacks or other output in your prompt). All context providers also now work on Windows. Additionally, @bra1nDump improved the relevancy of the '@' dropdown and added support for bookmarked URLs in the URLContextProvider.

📦 We launched on Product Hunt at the beginning of the month and finished the day at #9. Thanks to all of you who upvoted our launch post and made this happen!

💪 And last but certainly not least, thank you to everyone else who contributed this month: @elabbarw, @eltociear, @T3tr0, @mshameti, @CambridgeComputing, and @jmorganca!

What's upcoming?

1️⃣ Weekly updates. If the monthly summary above is not enough, join the Discord to see everything that's new each week.

2️⃣ Weekly office hours. We hang out and chat with the community each week on our Discord. Join if you want to ask questions or want to discuss something with us.

3️⃣ First Continue meetup in SF. Later this month, we will be meeting in person in San Francisco with the community. We will post in the Discord with more details soon!

Did you know?

If you run into an error or exception in your terminal, you can pass it to Continue with a keyboard shortcut: cmd+shift+r (MacOS) / ctrl+shift+r (Windows) 🪲

August 2023 Updates

How can you help?

⬆️ Press the "Notify me" button here and then upvote our Product Hunt launch on Tuesday

💬 Join our Discord and chat with us about your experience using Continue

💪 Open a GitHub Issue with a bug you found or a feature request you have

What's new?

⚡ Easily switch LLMs in the Continue config file (thanks to an awesome PR by @lun-4!)

🦙 Run Llama-2 locally with Ollama and even set it as your default (shoutout to @mchiang0610 for the PR)

🐍 Continue is now packaged using PyInstaller, meaning faster download times and fewer reliability issues

💻 Keyboard shortcuts to manage context, toggle Continue, and more

🐛 Dozens of bug fixes, including PRs from @3cognito on docs links and @sanders41 on Meilisearch client usage

✍️ Improvements to the Continue Docs (e.g. new troubleshooting docs and customization updates)

Did you know?

If you like to be able to see your files and folders at all times, you can drag Continue over to the right side of your window 🤯

If you liked this blog post and want to read more about DevAI–the community of folks building software with the help of LLMs–in the future, join our monthly newsletter here.