The number (and frequency) of articles and videos about Liquid Glass has increased in the past couple of weeks, and I’ve included a few relevant blog posts in this issue. As I’ve been working on making my Second Brain app ready for iOS 26 while maintaining backwards compatibility, these articles have been an important source of information.
I’ve also noticed that some ways to maintain backwards compatibility in SwiftUI require us to write more code, resulting in harder-to-maintain code bases. In this week’s livestream, I will look into ways to make this better.
Vibe coding is also gaining more momentum. It’s often been said that generative AI and vibe coding democratise computing. The result is a whole new dynamic in the developer ecosystem, which requires us to create content for two very different audiences: people who are completely new to computing, and LLMs. Yes, you read that right - we need to start thinking about creating documentation (and other content) in a way that makes it easier to consume for LLMs. My colleague Michael Bleigh has written a great piece about this - or has he? Read on to find out.
And if you’re interested in how Firebase Studio allows you to vibe code in the cloud, check out the recording of Firebase After Hours #14, in which Rody Davis (DevRel on Firebase Studio) and I vibe coded a retro calculator live.
Thanks for being a subscriber! If you like this newsletter, forward it to a friend or a colleague - it would mean the world to me.
Let’s be honest - even though we’d ideally all just support the latest APIs in our apps, not everyone will be on the latest iOS version on day one. But how can you use the new, shiny Liquid Glass APIs, and still make sure your app looks and works great on previous iOS versions?
In this episode of my “Building a Second Brain app” livestream series, I look at a couple of strategies that can help:
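One common approach (a sketch of the general technique, not necessarily the exact code from the livestream) is to wrap the availability check in a reusable view modifier, so each call site stays clean. The modifier name `glassIfAvailable` and the material fallback are my own choices; `.glassEffect()` is the new iOS 26 SwiftUI modifier:

```swift
import SwiftUI

// Hedged sketch: apply Liquid Glass on iOS 26, fall back to a
// material background on earlier versions. The name
// `glassIfAvailable` is invented for this example.
struct GlassIfAvailable: ViewModifier {
    @ViewBuilder
    func body(content: Content) -> some View {
        if #available(iOS 26.0, *) {
            content.glassEffect()
        } else {
            content.background(.ultraThinMaterial, in: Capsule())
        }
    }
}

extension View {
    func glassIfAvailable() -> some View {
        modifier(GlassIfAvailable())
    }
}
```

Call sites then read as `Text("Hello").padding().glassIfAvailable()`, and the availability logic lives in one place instead of being scattered across the code base.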
We’ve started creating a series of videos that explain all aspects of using Firebase AI Logic in your mobile and web apps. Firebase AI Logic is a Firebase SDK that lets you call the Gemini (and Imagen) APIs from your mobile and web apps in a secure way.
In this first video, I walk through the setup process, and how to generate text. In the next videos, we will look at more advanced concepts like tool calling and implementing chat - stay tuned!
To make it easier to follow along, we’ve created a sample app: Friendly Meals, an AI-powered recipe planner - the iOS version uses Liquid Glass - of course! You can find the code in this repository.
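For a taste of what the setup looks like on iOS, here is a minimal text-generation sketch based on the Firebase AI Logic documentation. The model name is just an example - check the docs for the currently supported models - and this assumes `FirebaseApp.configure()` has already run at app launch:

```swift
import FirebaseAI

// Hedged sketch of calling Gemini via Firebase AI Logic.
// Assumes Firebase has been configured at app startup.
func suggestRecipe() async throws -> String? {
    // Create a model instance; "gemini-2.5-flash" is an example name.
    let model = FirebaseAI.firebaseAI()
        .generativeModel(modelName: "gemini-2.5-flash")

    // Send a single-turn prompt and return the generated text.
    let response = try await model.generateContent(
        "Suggest a quick vegetarian dinner recipe."
    )
    return response.text
}
```

The SDK proxies the request through Firebase, so your Gemini API key never ships inside the app binary - that’s the “secure way” mentioned above.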
Majid has been publishing a whole slew of articles about applying the Liquid Glass effect in your apps - most of them focus on individual SwiftUI views, such as tabs or toolbars. This article goes beyond individual controls and zooms in on what you need to know when using Liquid Glass in your custom SwiftUI views.
Speaking of creating custom SwiftUI components that make use of Liquid Glass, here is Donny with a custom radial menu (as popularized by the Path app).
Donny emphasises Apple’s guidance to use Liquid Glass sparingly - for elements that sit on top of the UI, such as toolbars or floating menus. He then walks through the process of creating a radial menu (with animations no less) in less than 20 lines of code. SwiftUI never ceases to amaze me.
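The core trick behind a radial menu is simple trigonometry: place each item on a circle around the trigger button and animate it outward. Here is a minimal sketch of that idea (my own simplified version, not Donny’s exact code - his uses Liquid Glass, while this falls back to a plain material so it also runs on older iOS versions):

```swift
import SwiftUI
import Foundation

// Hedged sketch of a radial menu: items fan out on a circle
// around a central toggle button.
struct RadialMenu: View {
    @State private var isOpen = false
    let icons = ["pencil", "camera", "mic", "photo"]

    var body: some View {
        ZStack {
            ForEach(icons.indices, id: \.self) { index in
                // Distribute items evenly around a full circle.
                let angle = Double(index) / Double(icons.count) * 2 * .pi
                Image(systemName: icons[index])
                    .padding()
                    .background(.ultraThinMaterial, in: Circle())
                    .offset(x: isOpen ? cos(angle) * 80 : 0,
                            y: isOpen ? sin(angle) * 80 : 0)
            }
            Button {
                withAnimation(.spring()) { isOpen.toggle() }
            } label: {
                Image(systemName: "plus")
                    .padding()
                    .background(.ultraThinMaterial, in: Circle())
            }
        }
    }
}
```

Because the offsets are driven by a single `isOpen` state inside `withAnimation`, SwiftUI animates all items out simultaneously for free - which is exactly why this fits in so few lines.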
Michael provides an excellent guide on writing documentation specifically for LLMs. As AI-assisted coding becomes more prevalent, it’s crucial for library and framework creators to provide effective guidance that AI models can consume.
Some of the rules make sense even when you’re writing documentation for humans, for example “Positivity - provide positive examples, rather than negative ones”. I couldn’t agree more: no matter what you do - use a big fat stop sign, put a stern warning inside a red box, or even strike through the code - people will copy and paste your negative examples, so it’s better not to provide them in the first place.
What really put a smile on my face was that Michael went deeper and provided a rules file for writing rules about rules. So - did he write the blog post himself, or was it the AI? What do you think?
Peter has not only been busy vibe coding tools for vibe coding (all of this is very meta), but he also put together a repository of rules files for all aspects of the development process. A great resource for anyone looking to take their vibe coding game to the next level.
In Xcode 26, Apple introduced a new feature called Playgrounds, which lets you experiment with snippets of code right inside your project. In this repository, Ivan has gone to great lengths to use Playgrounds to exercise all of the Foundation Models framework’s features. Really amazing.
This comprehensive collection contains over 100 Swift playgrounds that demonstrate practically every capability of Apple’s Foundation Models Framework. The playgrounds are organized into logical categories, making it easy to explore specific use cases. Definitely worth checking out for inspiration.
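To give you an idea of the style used throughout the repository, here is a hedged sketch of what such a playground looks like - the `#Playground` macro (new in Xcode 26, via the `Playgrounds` module) runs the closure and shows live results inline; the prompt is my own example:

```swift
import FoundationModels
import Playgrounds

// Hedged sketch: a minimal Foundation Models playground.
// Runs on-device; requires Apple Intelligence to be available.
#Playground {
    // Start a session with the default on-device language model.
    let session = LanguageModelSession()

    // Ask for a response and inspect it inline in Xcode.
    let response = try await session.respond(
        to: "Name three uses for a second brain app."
    )
    print(response.content)
}
```

Because everything runs on-device, you can iterate on prompts without network round-trips or API keys - which makes the playground format a great fit for this kind of exploration.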
I’ve been using Xcode for (checks notes) more than 15 years now, and I thought I knew my way around it. However, this article still taught me something new! It’s a great collection of Xcode shortcuts and other hacks that will make you more productive. One of my personal favourites is CMD + Shift + J (Reveal in Project Navigator), which highlights the current file in the project navigator. Super useful when you’ve lost your way after navigating around the code base using CMD + Click or (and this is the one I had forgotten about) CMD + Ctrl + J.
This is a great blog post about public speaking in the context of tech conferences (big or small - a meetup is a conference as well!).
I’ve had the pleasure of seeing Rob present at a couple of conferences (most recently at iOSKonf 25 in Skopje) - he is a great speaker, and he has done a great job of condensing his experience into actionable advice in this post.
Rob is an advocate for accessibility, although maybe I should rather say: he cares deeply about making tech accessible for everyone.
It doesn’t come as a surprise that his blog post touches on accessibility as well, but in a somewhat unexpected way. You’ve probably heard the advice “don’t read your slides” many times. Rob argues the contrary: you should read your slides - as in “make sure that people who have a hard time seeing the content of your slides still understand what’s on them”. This obviously means you shouldn’t put too much stuff on your slides, which is another of Rob’s recommendations.
No matter if you’re new to speaking at conferences or an experienced speaker - I wholeheartedly recommend reading Rob’s article.
We’re well into the second half of the year, and here is your updated list of conferences for iOS, Swift, and AI. I will be speaking at a few of them, looking forward to seeing many of you IRL!
This video is a fascinating look back at a 1960s film predicting the future of telecommunications. They actually got a lot of things right - from answering machines and pagers to video telephony, telebanking, and other “computer services” - although I have to admit the way they made hard copies looks a bit hilarious in hindsight.