But it's wrong: using a LinearGradient resolves to the first color provided. We have to use a Gradient instead, as described in the docs: https://developer.apple.com/documentation/mapkit/mappolyline/4233585-stroke
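A minimal sketch of the fix, using the iOS 17 MapKit-for-SwiftUI API (the coordinates here are placeholders, untested):

```swift
import SwiftUI
import MapKit

struct RouteMap: View {
    // Placeholder route coordinates
    let route = [
        CLLocationCoordinate2D(latitude: 37.33, longitude: -122.01),
        CLLocationCoordinate2D(latitude: 37.34, longitude: -122.02)
    ]

    var body: some View {
        Map {
            MapPolyline(coordinates: route)
                // A LinearGradient here would resolve to its first color;
                // pass a Gradient so MapKit interpolates along the line.
                .stroke(Gradient(colors: [.blue, .purple]),
                        style: StrokeStyle(lineWidth: 4, lineCap: .round))
        }
    }
}
```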
Did Apple ever mention at WWDC23 this DI tool that can apparently be used outside of SwiftUI.View?
I am curious why it's defined in App Intents.
-Worked on the contract renewal application
-Answered a student’s question about psychographic data for a specific area in Honolulu. #librarianLife
-Received 2 peer evaluations to put in my contract renewal (yay!). So I am somewhat set there. #tenure
-Worked out despite deadlines, since I can’t neglect my health.
-Logged into #OpenSim (after upgrading the viewer), trying to help a friend.
-Definitely not upgrading to #iPhone 15 Pro (#WWDC23 was today)…I have the 14 Pro.
My other, more wild and even more last minute prediction for the #iPhone event is that the #iPhone15Pro will get #StageManager to accompany the #Thunderbolt port, #SamsungDex style. I also made this prediction for #WWDC23 but it makes more sense with a new USB-C iPhone.
Developer question, I’ve been looking into the new #Swift #OpenAPI generator announced during #WWDC23. I have an OpenAPI spec file taken straight from a #Python #Django server, it all seems to work fine inside Xcode using the dynamic build plugin. I’ve noticed though that the various client methods corresponding to API endpoints use the operationID as defined in the schema, resulting in snake case method names such as get_scheduled_events() instead of camel case getScheduledEvents() which would be the proper convention for Swift. This also goes for names of schema components defined in the spec. Is this expected and just the cost of using a generalized code generator, or am I doing something wrong? Is there some way to override this by e.g. defining method and schema model names manually using the CodingKey protocol, or do I need to convert operationIDs and schema names in my spec from snake to camel case? This is my first time ever looking into OpenAPI, so possibly I’m missing something obvious.
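For what it's worth, the generator seems to use the operationId verbatim, so renaming the IDs in the spec itself is the usual fix (it doesn't change the URL path, only the generated method name). A hedged sketch — the path and operation below are made up, not from my actual spec:

```yaml
paths:
  /scheduled-events:
    get:
      # Renamed from get_scheduled_events so the generated
      # Swift client method comes out as getScheduledEvents()
      operationId: getScheduledEvents
      responses:
        '200':
          description: OK
```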
📢 Announcing ObservableConverter, a handy new SwiftPM plugin to convert your SwiftUI code from ObservableObject to the new @Observable macro introduced at #WWDC23.
⬇️ Find out how it works and how to use it for your own app in our latest post!
My experiment shows this can work. I simply created a dummy perform() method for when the intent is compiled into the widget, crossing my fingers that it will never execute as long as the intent implements ForegroundContinuableIntent.
Though I worry it's not documented so it might stop working at some point. cc @mgorbach : is this well supported?
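Roughly what I mean, as a sketch (the WIDGET_EXTENSION compilation condition is a made-up custom flag you'd set on the widget target; the intent name is also made up):

```swift
import AppIntents

struct ContinueInAppIntent: AppIntent, ForegroundContinuableIntent {
    static let title: LocalizedStringResource = "Continue in App"

    func perform() async throws -> some IntentResult {
        #if WIDGET_EXTENSION
        // Dummy body: expected never to run in the widget process,
        // since ForegroundContinuableIntent should hand execution
        // off to the app. This hand-off is what isn't documented.
        return .result()
        #else
        // Real work happens here, in the app process.
        return .result()
        #endif
    }
}
```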
Is anyone else having this problem on #iOS17, which makes it so the battery isn’t completely charged when you wake up?
It seems to stop charging at 70%, and it only resumes around the time I wake up (probably when my first alarm goes off or when I unlock it for the first time). I disabled Optimized Charging because of this some weeks ago, but the problem persists!
The problem doesn’t happen every day, but it happens a lot (checking my BatLog, it seems to happen every other day, with some exceptions: sometimes it happens two days in a row, sometimes it skips two). Also, I used a fast charger for a short period of about 4 days and the problem did not happen, so either I was lucky or this only happens with non-fast chargers.
I was still trying to get a more consistent idea of what was happening before filing a bug report, but I’ll file one today.
PSA: StandBy mode does NOT require MagSafe charging.
It also works when you put your phone in landscape orientation while charging via Lightning.
ICYMI, a guide to the recommended way to watch #WWDC23 videos to learn SwiftData!
How to Learn #SwiftData in 2023!
A little blog post for the recommended order to watch the #WWDC23 session videos
There are other examples. One I came across a couple of months ago was using hashtags to keep up with news out of Apple’s developer conference.
So this means that the default behavior now is a cascading delete, which is what they should have chosen from the beginning.
However, I feel bad for anyone who watches the #WWDC23 videos later, only to find that "cascade" doesn't exist.
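For anyone landing here after watching those videos: as I understand it, the old `.cascade` option spelling was replaced by a `deleteRule:` argument. A hedged sketch (model names are made up, untested against the current beta):

```swift
import SwiftData

@Model
final class Stop {
    var name: String
    init(name: String) { self.name = name }
}

@Model
final class Trip {
    var name: String
    // Spell the rule out explicitly, in case the default changes again
    @Relationship(deleteRule: .cascade) var stops: [Stop]
    init(name: String, stops: [Stop] = []) {
        self.name = name
        self.stops = stops
    }
}
```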
If there were a service dedicated to helping indie apps on the App Store comply with Chinese law, would there be enough interest?
🎬 iPadOS and iOS 17 beta 5 and macOS 14 beta 5 no longer remove tracking information from Twitter's links. The QUERY_PARAM.wplist file doesn't include Twitter or X anymore.
Previous betas used to remove tracking information from Twitter's links:
Interesting thing about being on the edge. After upgrading to #iOS 17 Beta 5, my app crashed on launch. It didn't make sense. It worked 5 mins ago! Upgrading Xcode to the latest beta and launching it immediately showed that my code wouldn't compile because of changes in declaring #SwiftData relationships between models. Sooo... Time to read the #Xcode change logs (https://shorturl.at/chP56) and see what can be done here. Should be some very simple changes.
Did you know that UIDocument is the core of every document-based app? In this #WWDC23 session https://developer.apple.com/videos/play/wwdc2023/10056/ learn how to load, save, and access document content effectively while ensuring thread safety. #AppDevelopment #DocumentManagement
How can UserDefaults, which is only readable/writable by the app itself, be used for fingerprinting?
Apps being sold and bought?
New episode is out! And it's just me!
Hear about what I've been working on and my thoughts on all the new #WWDC23 APIs.
I'm releasing a major update to @PleaseDontRain on Sticky Widgets day (also known as Monday).
I made significant changes to the UI based on feedback I got from a design lab at WWDC. I thought that it would be fun to make a blog post describing their feedback and how I adjusted the app.
There's also a big jump in the number of images (and quality) as well as the ability to select themes!
#SwiftUI animation is hard. Harder than it should be.
I appreciate folks who have put out resources (blogs, tutorials, sample code, etc) to help others grasp at the capabilities of the animation system. I've seen some truly incredible work. The #WWDC23 talks on the Animation system were also enlightening.
But it still feels really hard to go from a vision or idea to an implementation. In my opinion, *we need better debugging and introspection tools*. There's too much guess-and-check right now.
According to Gurman, #Apple is planning an appointment-only retail launch of the #Vision Pro in the US in early 2024, starting with stores in major areas like NY and LA. This is not surprising news, since reports indicate supply of the product is going to be limited.
Question on running Apple sample code; at #WWDC23 they demoed Backyard Birds that uses #SwiftData and the new Widgets. https://developer.apple.com/documentation/swiftui/backyard-birds-sample
Getting an error “Embedded binary is not signed with the same certificate as the parent app.”
The Readme tries to help, but can someone tell me what in the name of Steve Jobs is “the Options tab" in Xcode?
So that whole lower Eloquence volume thing on iOS 17 beta 3: turns out it's variable volume if you're using Eloquence without headphones. This achievement brought to you by ... I have no clue. It lowers and raises the volume randomly as far as I can tell. It doesn't happen if you're doing a continuous read, but it does happen when navigating through the interface.
Finally got watchOS 10 Dev beta 3 installed. Very happy to see that Eloquence doesn't default to higher sample rate on watchOS any more. I'm sort of getting the hang of this new interface. I want to see how it develops. I'm not convinced about how well it will work for efficiency for VO users. It may just be that I need to get used to it.
As reported by @Marco, there is one (possibly 2) new UK English Siri Voices in iOS 17 Dev beta 3. Voice four sounds strange in that it is trying to imitate Cockney. It doesn't always succeed though. Voice three, which I also believe to be new, is possibly trying to imitate a Lincolnshire accent. This one is far more consistent. I could be wrong about the Lincolnshire part though. That voice sounds good, and I can recognize an audiobook narrator in it.
Even though image recognition models are showing 0 KB of usage, they're working. I'm happy to see potential multiple language support for image descriptions in iOS 17 Dev beta 3.
It must be my memory playing tricks on me. I don't recall seeing speed offset for Eloquence in beta 2. It's there in beta 3.
The problem with words like "Vegan" has been fixed. Somewhat. Now the last syllable sounds a bit muffled. Lol. No idea what they're doing there.
What? You mean to tell me that emoji isn't pronounced "back quote left bracket mo j i" with Eloq? First of the fixes in iOS 17 Dev beta 3. The 50% pitch issue with Eloquence is still there. Eloq is a bit quieter. I think. I can't tell if the dictionaries have been updated yet.
🌪️ Overwhelmed by all the sessions from #WWDC23? If you’re not sure where to start, here’s a list of some of our favorites:
Missing #WWDC23? Us too. 😢
Check out our latest blog post where we reminisced about some of our favorite sessions 👇
So here’s an article answering the most common questions about String Catalogs:
**Big update in Safari Technology Preview 173 — featuring a new Feature Flags window** (much better than Chrome's about:flags!) as well as a new **Develop menu**, among other improvements.
2nd episode is out! File feedback, goodbye 👋
New video out, today we're talking about why the Apple Vision Pro is dangerous!:
Support the Channel Through Patreon: https://www.patreon.com/linuxlounge
Make a One Off Ko-Fi Donation: https://ko-fi.com/linuxlounge
S3E30: The Tock after the Tick! 💡
We discuss the aftermath of WWDC, the Vision Pro (of course), telepresence in video calls via personas, and app tracking transparency.
This episode is proudly sponsored by https://feedbackbulb.com/wfr
🍎🎧 If you’re still catching up on sessions from #WWDC23, here are a few of our favorites to put at the top of your list:
I'm just beginning to digest the incredible content of WWDC23's “Spatial Computing” category 🫀
Notes from my WWDC23 Core Data lab, mostly about CloudKit sync but touching on SwiftData and a few other things https://useyourloaf.com/blog/wwdc23-core-data-lab-notes/ #WWDC23 #CoreData
Eloquence on iOS 17 (and I bet on macOS) betas is pronouncing certain words with the letter 'a' as ar instead of 'a'. "Meta" is pronounced "metar", "vegan" is pronounced "veeghern", etc. There are other examples.
Tell me I'm not just imagining this. Where in the world does this come from?
For those with experience getting Feedbacks addressed and/or responded to...
If a SwiftUI bug you found in iOS 16 seems to have persisted in iOS 17, is it better to “Add More Information” to an existing Feedback, or is it better to create a completely brand new Feedback specific to the new iOS release?
Hoping to see this fixed before iOS 17 ships.
(FB12145152 for any Apple engineers who might be curious…)
Updating PDX Transit (SwiftUI) for watchOS 10. Running it now for the first time and without any work, it already looks way better. Apple really did a good job with the new Navigation styling debuting in watchOS 10. Navigation, scrolling, and titles animate better than ever before! This is fun, fluid UI! :D
Based on the analysis in this blog post, #visionOS relies on semantic HTML rather than author-defined CSS hover styles to render things properly. All of a sudden, the things those of us in #accessibility have been saying about properly using the semantic web become important.
🤷🏾 I guess.
Based on the #visionOS Developer documentation, Apple is not supporting apps like:
"Movement-based apps. This includes apps that follow a person’s location changes, such as apps that offer turn-by-turn directions or navigation. It also includes apps that track body movements.
Selfie or photography apps. This includes apps where the primary purpose is to capture images or video from the device’s cameras."
For now, this will limit some #accessibility scenarios.
One thing I just noticed in the first iOS 17 beta is that, when you go to update your device, it gives you an estimate of how long the update will take. For example, it told me the update would take 20 minutes. It also reminds you to back up.
Happy to see that Apple's announcement of the availability of visionOS development tools prominently mentions #accessibility, so that developers can start considering it as they look at developing for the #VisionPro device.
I was at an #a11y design lab at #WWDC23 and was told by two engineers that Study should post a ‘screenChanged’ notification every time I change a tab in the tab bar, effectively changing the focus of VoiceOver from the tab item to the navigation title.
But I tried several system apps and none of them do that. So I don’t know what to do with that feedback - Should I stick with the convention or do what I was told at the lab? 👀
Any suggestions? Did any of you ever hear of such behaviour?
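For context, here's roughly what the lab engineers suggested, as a sketch (the function name and `titleView` parameter are placeholders for whatever element should take focus in your app):

```swift
import UIKit

// After switching tabs, move VoiceOver focus from the tab bar item
// to the new screen's navigation title by announcing a screen change.
func didSelectTab(focusing titleView: UIView) {
    UIAccessibility.post(notification: .screenChanged, argument: titleView)
}
```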