MacKuba

Kuba Suder's blog on Mac & iOS development


Fix Bugs Faster Using Activity Tracing

Categories: Foundation, Logging, WWDC 14

⚠️ The APIs mentioned in this talk seem to be somewhat abandoned – they were never made available to Swift, some things have been deprecated or have stopped working.


Apple has been adding a lot of tools for asynchronous development, like XPC, GCD, NSOperation

However, splitting work between different threads and especially between different processes makes it difficult to debug and diagnose issues

A single call to a system API might result in multiple system daemons calling each other to execute this single activity

Current logging mechanisms are insufficient, because they lack context showing how you got to the place where something happens

When something goes wrong in one of the systems involved, the whole activity fails, but it’s not clear where to look for the problem

Goals of this new tool:

  • reduce the time spent guessing what happened where, how it got there, and why
  • understand interaction of multiple actions over time
  • it should be lightweight and easy to use

Activities

Activity = a set of work triggered by a specific action, going across multiple systems cooperating to realize a specific goal

There’s a new system daemon diagnosticd that handles the activity tracing for all processes

Each activity has an identifier (AID) automatically propagated across the system

Activities are automatically created by UIKit and AppKit for UI actions

When you e.g. press a button in the UI, AppKit or UIKit automatically creates a new activity with some unique name, calls os_activity_start() on it, calls your IBAction handler, and then calls os_activity_end() at the end

You can also create activities explicitly:

os_activity_initiate("Activity Name", flags, ^{ … });
  • name must be a constant string
  • another variant available without a block
  • included in <os/activity.h>

Detached activity (OS_ACTIVITY_FLAG_DETACHED) = a new activity that’s not a subactivity of the current one, but a completely independent activity

Breadcrumbs

Breadcrumb = a user-facing or user-initiated action that triggered an activity, e.g. “Send an email”

A way to label activities meaningful to you

Adding a breadcrumb:

os_activity_set_breadcrumb("composing email");
  • name must be a constant string
  • only supported in the main process binary, not in libraries or plugins

⚠️ Replaced later with os_activity_label_useraction, only for activities started from IBActions

Trace messages:

Trace message = a single log in a given system as a part of an activity

New API to add trace messages to the current activity

Messages are stored in an in-memory ring buffer

Different behavior for production code and debugging

os_trace("Something happened at level %d", index);
  • only scalar format types allowed (ints, longs etc.)
  • strings and characters are *not* supported for privacy, security and performance reasons
  • included in <os/trace.h>
os_debug_trace("Open socket %d total time %f", sock, time);
  • version for debug logging
  • only actually logged if you’re in the debug mode, ignored in release mode
  • records more information – increased buffer size
  • you can enable debug mode at launch by setting env variable OS_ACTIVITY_MODE=debug

If you want to pass some additional data with a trace message, you can use os_trace_with_payload:

os_trace_with_payload("Interface: %ld", index, ^(xpc_object_t dict) {
    xpc_dictionary_set_string(dict, "interface_name", ifname);
});
  • blocks are only called, and their payload data recorded, if the activity logs are currently being live-streamed to a log viewing tool
  • uses XPC to deliver data to diagnosticd
  • you can enable stream mode with an env variable OS_ACTIVITY_MODE=stream

⚠️ Replaced later with os_log

Crash logs now include information about the current activity at the moment of the crash (name, running time etc.), the last few breadcrumbs, and recent trace messages from that activity (!)

⚠️ They don’t anymore :(

Logging errors into trace messages:

os_trace_error("Interface %ld failed with error %d", ifindex, errno);
  • soft errors, something that went wrong
os_trace_fault("Invalid state %d - will likely crash", state);
  • fatal errors, this shouldn’t happen, about to crash
  • sends trace messages from the process’s local buffer to diagnosticd

Limits:

  • format string cannot exceed 100 characters
  • formatted trace message string also has a limit, will truncate if exceeded
  • up to 7 parameters allowed
  • process ring buffer size depends on the platform and debug/release mode

Debugger: thread info includes current activity info and last trace messages

ostraceutil command line tool – live streaming of activity trace from a process (by name or pid)

⚠️ Replaced later with log

Tip: think about privacy when adding trace messages – never trace identifying information about a user or device

Unified Logging and Activity Tracing

Categories: Foundation, Logging, WWDC 16

Design of the API:

Creating one common efficient logging mechanism that can be used in user and kernel mode

Maximize information collected while minimizing the “observer effect” (affecting the program and the analyzed issue by adding logging)

Minimal footprint during the call – a lot of processing is deferred to some later time, e.g. to the moment when the information is displayed when viewing logs

Managing message lifecycle – different kinds of messages are kept around for a different duration depending on how important they are

We want as much logging happening as possible all the time without it affecting the performance of the system

Designed for privacy from the ground up

New ways of categorising and filtering log messages, so it’s easier to find messages you care about

Caller information is collected automatically, so no need to add info about the file/line etc. to the message

Built-in type specifiers that help with message formatting

New console app and command line tool

Supported across all platforms and simulators

Legacy APIs (NSLog, asl_log_message, syslog) are all redirected to this new system (if you build on the latest SDK)

All log data will be in a new location and a new format

Log data is kept in a compressed binary format, as .tracev3 files

Stored in /var/db/diagnostics and /var/db/uuidtext

You must use new tools to search the logs, since the data is binary and not searchable using grep etc.

New .logarchive format for sharing logs exported from the system log

Subsystems and categories:

Log messages can now be associated with a subsystem (e.g. an app, target or module) and a category (e.g. a section of an app), which can be used to control how messages are filtered and displayed

A subsystem can have multiple categories, and you can use as many subsystems and categories as you need

Each log message has a level:

  • 3 basic levels: Default, Info, Debug
  • 2 special levels: Fault, Error

You can configure, per category, per subsystem or globally, which log levels are enabled (Default and above are always enabled) and whether they are stored to disk or only in memory (the in-memory log is a temporary space whose contents are overwritten much faster than the disk logs)

Behavior can be customized globally using profiles or the log command on macOS

Default configuration is:

  • Logs of level Default and above are saved to disk
  • Info logs are enabled and stored in memory
  • Debug logs are disabled

Privacy:

The new logging system is designed to prevent accidentally logging sensitive personal information to the logs

Dynamic data injected into log messages is assumed to be private, static strings are not private

Errors and faults:

The system collects some additional information when logging errors and faults to help with investigating issues

Error  ⭢  an issue discovered within the given application/library

On an error, all previous logs from this process logged to memory are saved to disk so that they can be kept for longer

Fault  ⭢  a more global problem in the system

On a fault, all logs from this process and other processes involved in the same activity are saved to disk, plus some other system information is collected

Faults and errors and the information collected with them are saved to separate log files, so the normal logs don’t push them out

How the system works:

Within each process there are some buffers for logging messages

There is a logging daemon logd in the system, when the buffers fill up it compresses them into a larger buffer that it maintains

If you request a live stream of logs, the logs are sent to diagnosticd daemon which sends them to the client

There is a large performance hit when doing live streaming of logs, since all the optimizations can’t be used

The Console app:

Sidebar shows different kinds of log sources and connected devices

“Activities” button lets you turn on activities mode that shows a path of logs connected to a specific activity

When streaming live, by default Debug and Info messages are not printed – enable them in the menu to see them

Colored dots on the left side show message level (no dot = default level)

Use left/right arrows to expand and collapse messages in the list

To filter messages, you can search for a string in any field, or you can ask to exclude rows from a given process/subsystem/category etc.

Use the context menu to show/hide messages of the same category/process etc. as the highlighted one

Type e.g. “proc:myapp” to quickly search for process = myapp

You can save frequently used searches using the “Save” button, they’re added to the filter bar above the table

Activities can also be filtered in a similar way (however, they don’t have subsystems, categories or levels)

New APIs for logging:

os_log – writes a log of default level

os_log_info, os_log_debug, os_log_error, os_log_fault – different levels

os_log_create – creates a log object that you can customize

os_log_t log = os_log_create("com.yourcompany.subsystem", "network");

Then you pass this log object to the calls listed above:

os_log(log, "Something happened")

You can have multiple log objects for different categories and use the right one for each log

If you don’t want to have a custom subsystem/category, pass OS_LOG_DEFAULT as the log object
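The same logging API is available from Swift as well – a minimal sketch (the subsystem and category names are placeholders):

import os

let networkLog = OSLog(subsystem: "com.yourcompany.subsystem", category: "network")

os_log("Something happened", log: networkLog)                      // default level
os_log("Request took %d ms", log: networkLog, type: .info, 142)
os_log("Request failed: %{public}@", log: networkLog, type: .error, "timeout")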

Log API includes built-in formatters for various types

Converting a value into a string representation is deferred to when the message is displayed in the Console

Timestamps: %{time_t}d

Error codes: %{errno}d

Arbitrary binary data: %.*P

etc.

Privacy:

Privacy is handled on a per parameter basis

Static strings are assumed to be public, dynamic strings and objects are assumed to be private unless you override it

Overriding an object to be public: %{public}@

Overriding a simple value to be private: %{private}d

You can combine privacy and formatting: %{public, uuid_t}.16P

Activities:

Activities are now objects that you can store and reuse

You can easily control the relationships between activities

os_activity_create – creates an activity object (you can pass it a parent activity to create a hierarchy)

os_activity_label_useraction – marks the activity as a “user action” (initiated by the user)

os_activity_apply(activity, ^{ … }) – executes that block as a part of that activity

os_activity_scope(activity) – executes everything until the closing brace as a part of that activity

The log command line tool:

Same functionality as the Console, but from the command line

log stream – live streaming logs

log stream --predicate 'eventMessage contains "my message"'

log show system_logs.logarchive – shows logs from a file

log config --mode "level:debug" --subsystem com.mycorp.app – enable debug logging for the given subsystem

Logging tips:

Don’t add any extra whitespace or formatting to logs

Let the SDK do string formatting

Avoid wrapping os_log calls in some custom APIs (or use macros, not functions)

Only log what’s needed from arrays and dictionaries, they can take a lot of space on disk

Avoid logging in tight code loops

How to use various levels:

  • default – for normal logs about what the app is doing
  • info – for additional info that gets stale quickly and will only matter in case of an error
  • debug – for high volume logging during development
  • error – to capture additional information from the app
  • fault – to capture additional information from the system

Collecting logs for bug reports:

The preferred method is sysdiagnose

Sysdiagnose collects logs in a file named system_logs.logarchive

You can use a specific key combination on the given device to trigger a sysdiagnose, and then transfer using iTunes

Deprecations:

All ASL logging APIs are deprecated

Older activity APIs: os_activity_start, os_activity_end, os_activity_set_breadcrumb, os_trace_with_payload

What's new in SwiftUI

Categories: SwiftUI, WWDC 20

App & scene definition:

The whole app can now be built with SwiftUI, including what was previously in app delegate:

@main
struct HelloWorld: App {
    var body: some Scene {
        WindowGroup {
            ...
        }
    }
}

A WindowGroup renders as a single window on iPhone and Watch, on iPad it supports the new multi-window scene system, and on the Mac it creates multiple Mac windows

Settings – creates Mac preferences windows:

#if os(macOS)
    Settings {
        PreferencesList()
    }
#endif

For document-based apps:

@main
struct ShapeEditApp: App {
    var body: some Scene {
        DocumentGroup(newDocument: SketchDocument()) { file in
            DocumentView(file.$document)
        }
    }
}

Defining menu commands:

.commands {
    CommandMenu("Shape") {
        Button("Add Shape", action: { ... })
            .keyboardShortcut("N")
        ...
    }
}

New multiplatform Xcode project template that creates one SwiftUI app for all platforms

New “Launch Screen” Info.plist key

Lets you define launch screen elements like background, navigation bar, tab bar

Replacement for the launch storyboard for SwiftUI apps that don't otherwise have storyboards

New API for widgets:

@main
struct RecommendedAlbum: Widget {
    var body: some WidgetConfiguration {
        ...
    }
}

Complications for Apple Watch can now also be built using SwiftUI

Outlines (hierarchical lists):

List(graphics, children: \.children) { … }

Grids:

LazyVGrid(columns: [GridItem(.adaptive(minimum: 176))]) { … }
-> adapts number of columns to view size

LazyVGrid(columns: Array(repeating: GridItem(), count: 4)) { }
-> fixed number of columns

Also horizontally loading grids

Lazy versions of the existing stack views: LazyVStack, LazyHStack

Toolbars:

.toolbar {
    // placed automatically in an appropriate place
    Button(action: { ... }) {
        Label("Record", systemImage: ...)
    }

    ToolbarItem(placement: .primaryAction) {
        Button(...)
    }
}

ToolbarItem has various semantic names for placement like principal, confirmationAction, bottomBar

Labels:

Label("Books", systemImage: "book.circle")
Label { Text("…") } icon: { Image(systemName: "…") }
  • combines an icon with a title and shows them appropriately for the context

.help("Does something") – tooltips on macOS, voiceover comment on iOS

.keyboardShortcut("X")

.keyboardShortcut(.cancelAction / .defaultAction)

New support for focus on tvOS

ProgressView – spinner or progress bar

Gauge – on watchOS, the circular meter control from complications

.matchedGeometryEffect(id: albumId, in: namespace) – animates a view from one view hierarchy into another

.clipShape(ContainerRelativeShape()) – applies rounded corners matching the parent view

Text – custom fonts are now adapted to Dynamic Type

Images can be embedded inside text and are scaled with it

@ScaledMetric var padding – a value defined by you that scales with Dynamic Type
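For example, a minimal sketch (the property name and base value are arbitrary):

struct Card: View {
    // 16 points at the default text size, scaled together with the user's Dynamic Type setting
    @ScaledMetric(relativeTo: .body) var padding: CGFloat = 16

    var body: some View {
        Text("Hello").padding(padding)
    }
}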

App accent color can now be set in the asset catalog

Support for accent color on the Mac

.listItemTint() customizes the color for a section or item

Tint parameter in control styles: .toggleStyle(SwitchToggleStyle(tint: .accentColor))

Link(destination: url) { Label(…) } – a link label that opens a URL in Safari

@Environment(\.openURL) var openURL – provides a openURL(url) function
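A minimal usage sketch (the view name and URL are just examples):

struct BlogLinkButton: View {
    @Environment(\.openURL) var openURL

    var body: some View {
        Button("Open blog") {
            // the environment value can be called like a function
            openURL(URL(string: "https://mackuba.eu")!)
        }
    }
}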

UniformTypeIdentifiers framework provides content type identifiers for drag & drop

SignInWithAppleButton()

SwiftUI views for: AVKit, MapKit, SceneKit, SpriteKit, QuickLook, HomeKit, ClockKit, StoreKit, WatchKit

Stacks, Grids, and Outlines in SwiftUI

Categories: SwiftUI, WWDC 20

Stacks

New lazy stack views: LazyVStack and LazyHStack

They work just like the existing horizontal and vertical stacks, but they render their content incrementally as it becomes visible

Useful for very long lists built using VStack/HStack that had performance issues previously

There’s no point using lazy stacks for small local stacks that just lay out single views that are all visible on the screen together

If you’re unsure, use standard HStack/VStack by default, and only use lazy stacks for long content that’s scrolling and becomes a performance bottleneck
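For example, a long scrolling list built with a lazy stack (the row content is arbitrary):

ScrollView {
    LazyVStack(alignment: .leading, spacing: 8) {
        ForEach(0..<1000, id: \.self) { index in
            Text("Row \(index)")   // rows are only created as they scroll into view
        }
    }
}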

Grids

New grid views: LazyHGrid and LazyVGrid

They lay out subviews in a grid like collection view

LazyVGrid(columns: columns, spacing: 0) { … }

Grids require a definition of columns:

Constant number of columns:

var columns = [
  GridItem(spacing: 0),
  GridItem(spacing: 0),
  GridItem(spacing: 0)
]

Adaptive columns, number depends on the window size:

var columns = [
  GridItem(.adaptive(minimum: 300), spacing: 0)
]

Lists and outlines

Lists are more than just stacks of content – they support selection and scrolling etc.

Lists don’t need a Lazy variant, because list contents are always loaded lazily

Lists can now show hierarchical trees of items (outlines)

To make an outline list, provide a “children” keypath:

List(graphics, children: \.children) { graphic in
    GraphicRow(graphic)
}

To make collapsible list sections, use an OutlineGroup:

ForEach(canvases) { canvas in
    Section(header: Text(canvas.name)) {
        OutlineGroup(items, children: \.children) { graphic in
            GraphicRow(graphic)
        }
    }
}

You can also implement other views with parts that the user can collapse and expand like in an outline using DisclosureGroup:

DisclosureGroup(isExpanded: $expanded) {
  Contents()
} label: {
  Label("Name", systemImage: ...)
}

or:

DisclosureGroup("Label") { contents }

Build document-based apps in SwiftUI

Categories: SwiftUI, WWDC 20

For document based apps, use the DocumentGroup scene as the main scene of the app:

DocumentGroup(newDocument: MyDocument()) { file in
  ContentView(document: file.$document)
}

The content view receives a binding to the document contents, so when it changes the contents, the system knows it was modified

Document based apps have a “Document Types” section in the Info.plist, where you declare Uniform Type Identifiers of document types associated with your app

Imported types  ⭢  documents from other apps or sources that your app is able to open

Exported types  ⭢  document types owned by your app

TextEditor() – built in text view type

The type that implements the document model conforms to FileDocument (for value types) and declares its own UTType instances that represent imported and exported file types:

import UniformTypeIdentifiers

extension UTType {
    static var exampleText: UTType {
        UTType(importedAs: "com.example.plain-text")
    }

    static let shapeEditDocument =
        UTType(exportedAs: "com.example.ShapeEdit.shapes")
}

Imported type needs to be a computed property, because the value returned from the constructor may change between calls while the app is running, depending on system configuration. Exported type can just be assigned once and stored.

The document type provides a list of types (own and generic) that it can accept:

static var readableContentTypes: [UTType] { [.exampleText] }

It also has methods for reading and writing its document to/from a file, which you need to implement, using e.g. Codable to encode/decode the value into your chosen format:

init(fileWrapper: FileWrapper, contentType: UTType) throws { ... }

func write(to fileWrapper: inout FileWrapper, contentType: UTType) throws { ... }

In those methods, you can assume that the content type is one of those you declared as accepted by your app.
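Putting this together, a minimal sketch of a plain-text document type (note that in the SDK that eventually shipped, the read/write methods use configuration parameters instead of the fileWrapper/contentType signatures shown above):

import SwiftUI
import UniformTypeIdentifiers

struct ExampleTextDocument: FileDocument {
    var text = ""

    static var readableContentTypes: [UTType] { [.exampleText] }

    init(text: String = "") {
        self.text = text
    }

    // decode the document from the file's contents
    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let string = String(data: data, encoding: .utf8) else {
            throw CocoaError(.fileReadCorruptFile)
        }
        text = string
    }

    // encode the document into a FileWrapper for saving
    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        FileWrapper(regularFileWithContents: Data(text.utf8))
    }
}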

What's new in Swift

Categories: Swift, WWDC 20

Improvements to binary size, especially in SwiftUI apps

Smaller memory usage overhead from things like internal caches of the Swift runtime (use iOS 14 deployment target for best effect)

Apple can now use Swift in very low level frameworks, where previously only C was available

Much better compiler diagnostics (warnings and errors)

New diagnostics subsystem that helps identifying code issues much more precisely

More actionable errors with guidance how to fix them

Improved code completion

Code completion better understands features such as ternary operator or keypaths

Drastically improved performance, up to 15x faster

Improved code indentation in things like: chained calls, call arguments, multi-line control flow statements

Better error messages for runtime failures like arithmetic overflow

Debugging improvements: improved module importing, which should lead to fewer issues when evaluating variables in lldb

Cross-platform support:

Improved support for various distributions of Linux

Coming soon: initial support for Windows

Swift AWS Lambda runtime

New language features:

Multiple trailing closures

(API design tip: design names for methods with closure arguments so that the call site still makes sense if a trailing closure w/o an argument label is used)
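For example, a SwiftUI button written with the new multiple trailing closure syntax (the action function is made up):

Button {
    openDetails()                  // first trailing closure – no label (the action)
} label: {
    Label("Details", systemImage: "info.circle")
}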

Keypaths can be used as functions, so you can pass them to APIs that expect a function, like xx.map(\.field) (this is from Xcode 11.4)

@main – marks the entry point to the program

As a library author, you can declare a static function main() in a class or type

Users can then annotate a type that inherits from or conforms to it with @main, and the compiler implicitly creates a main.swift file that calls that function

// in the library:
class Foo {
  public static func main() {
    …
  }
}

// in the app:
import FooKit

@main
class Hello: Foo {
  …
}

Replaces the old @UIApplicationMain attribute, which could only be used on a UIApplicationDelegate class

Increased availability of implicit self in closures:

  • you can skip self. if you capture [self] in the closure arguments
  • you can skip self. if self is a struct or enum (like in SwiftUI)
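For example, the first case (explicit [self] capture) looks like this – a sketch with made-up names:

import Foundation

class Counter {
    var count = 0

    func scheduleIncrement() {
        DispatchQueue.main.async { [self] in
            count += 1   // no "self." prefix needed, because self was captured explicitly
        }
    }
}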

Multi-clause catch statements in do-catch
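For example (the error cases and functions are hypothetical):

do {
    try loadFeed()
} catch FeedError.notFound, FeedError.accessDenied {
    showEmptyState()        // one catch clause handling several error patterns
} catch {
    showAlert(for: error)
}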

Automatic comparable conformance for (some) enum types
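For example, an enum without associated values gets < synthesized based on the declaration order of its cases:

enum Size: Comparable {
    case small, medium, large
}

Size.small < Size.large   // true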

Enums can fulfill protocol requirements requiring static properties (plain case) and static functions (case with arguments)
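A sketch of both variants (the protocol here is made up):

protocol DefaultSettingProviding {
    static var standard: Self { get }
    static func custom(_ value: Int) -> Self
}

enum Quality: DefaultSettingProviding {
    case standard      // a plain case satisfies the static property requirement
    case custom(Int)   // a case with an associated value satisfies the static function requirement
}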

View builders improvements:

  • support for more kinds of statements, e.g. switch and if-let
  • adding a view builder annotation to a body declaration that returns more than one child view is not necessary anymore

Float16 – a floating-point type that uses 2 bytes (half the size of a standard Float)

Apple Archive – new archive file format, .aar extension

Fast, uses multithreading

Includes a command-line tool and Finder integration

Swift API – AppleArchive framework

Swift System – Swift interfaces to system calls and other low level APIs

OSLog:

Improved performance

Support for string interpolations and formatting options:

logger.log("\(offerID, align: .left(columns: 10), privacy: .public)")
logger.log("\(seconds, format: .fixed(precision: 2)) seconds")

Packages available on GitHub:

Swift Numerics

Swift Argument Parser

Swift Standard Library Preview

iPad and iPhone apps on Apple Silicon Macs

Categories: Mac, UIKit, WWDC 20

iPhone and iPad apps can now run unmodified on macOS on ARM, the same exact binary as on iOS

Uses Catalyst APIs, so apps get a lot of behavior for free

Making a separate Catalyst build manually is still recommended for optimal result (and supports Intel Macs)

To run on the Mac, an iOS app must not:

  • depend on missing symbols, frameworks or other functionality (Xcode should help you at least in some cases)
  • depend on hardware which Macs don’t have

Compatible apps are automatically made available for Macs unless you opt out

Possible compatibility issues:

Hardware:

  • Macs use mouse & keyboard and not touch screens, so if an app uses touch in non-standard ways or relies on multi-touch, this might not work
  • apps that depend on sensors like accelerometer, gyroscope, compass, depth camera, precise GPS will not work
  • iOS apps expect one front and one back camera; Macs may have one, none or multiple (use AVCaptureDeviceDiscoverySession – see the sketch below)
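A sketch of enumerating whatever cameras are actually present instead of assuming a fixed front/back pair:

import AVFoundation

let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera],
    mediaType: .video,
    position: .unspecified
)
let availableCameras = discovery.devices   // may be empty, one, or several devices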

UI:

  • alerts and panels like open/save will look different than on iOS, so don't assume their dimensions and position
  • if an iPad app supports multitasking, it will be fully resizable to any dimensions, including some layouts and sizes you might not expect
  • iPhone-only apps and iPad apps that don’t support multitasking will run in a fixed size window matching iOS device size
  • window resizing is live on macOS – make sure it’s optimized

System software:

  • on the Mac, user can move the app bundle to different locations, application container is also in a different location
  • APIs returning device type or model will include Mac model types
  • screen size will not be one of the standard iOS device screen sizes that you may expect

You can debug, profile and test iOS on an ARM Mac as you would expect

Distribution for Mac devices works the same as for iOS devices

Apps can reuse existing subscriptions and in-app purchases

App thinning automatically selects the most appropriate resources for the given Mac

→ when making a thinned build, you can select the option to optimize for any Mac

TestFlight is (still) not available on the Mac

Building Custom Views with SwiftUI

Categories: SwiftUI, WWDC 19

Layout for a simple view with just text:

  • text sizes to fit the contents
  • the view's content view contains everything inside, i.e. the text, and also has the size needed to fit all the subviews with their preferred size
  • the root view of the screen takes the whole frame of the screen and centers the small content view inside itself

By default the root view does not fill the safe area at the top – to make it really fill the whole screen, use .edgesIgnoringSafeArea(.all)

The layout process in general goes like this:

  1. The parent proposes a size to the child (the maximum area that it can offer)
  2. The child chooses its own preferred size
  3. Parent places the child inside its coordinate space, by default in the middle

This means that subviews have their own sizing behaviors, the child decides what size it needs for itself

If a view uses a modifier like .aspectRatio(1) or .frame(width: 50, height: 10), the parent has to obey this

Coordinates are always rounded to the nearest pixel, so there are crisp lines and no antialiasing

.background() inserts a view wrapping directly the view it’s called on, and it always has the same bounds as that view – so it can be useful for debugging to see the actual sizes of each view

A background view is “layout neutral”, so it doesn’t affect the layout at all

.padding() adds padding around the view – if not specified, SwiftUI chooses the default amount of padding appropriate for the given element, platform and environment; it offers its subview slightly smaller area than it was offered, inset by the specified padding

A Color view fills whatever space it’s given

Images:

Images are by default fixed size (equal to the image dimensions), unless you mark them as resizable – so just applying .frame(…) to an image won’t change its size, it just wraps it in a larger frame (empty on the sides)

A .frame() is *not* a constraint like in AutoLayout – it’s just a wrapping view that proposes a specified size to its child – which the child can take into account or not, depending on the type of view
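For example (the image name is a placeholder):

Image("photo")
    .frame(width: 200, height: 200)   // image keeps its intrinsic size, the frame just adds empty space around it

Image("photo")
    .resizable()
    .aspectRatio(contentMode: .fit)
    .frame(width: 200, height: 200)   // now the image scales to fit the proposed size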

“There is no such thing as an incorrect layout… unless you don’t like the result you’re getting” :D

SwiftUI automatically applies spacing between elements depending on their kind

You can override the spacings, but the defaults should usually be the right values

When you view the layout in a right-to-left language, SwiftUI automatically switches all subviews in the correct direction

Stacks:

A stack takes the space it was given by the parent, deducts the spacing, and divides the rest equally among its children, proposing that space to the children starting with the least flexible ones (e.g. a fixed size image)

If they don’t take all available space, they’re aligned inside the stack according to specified alignment, and then the stack sets its size to enclose all the children

To define which children in a stack take more available space if there isn’t enough, use .layoutPriority(x) to specify their priority (default is 0)
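For example, in this HStack the title gets first claim on the remaining space (the contents are made up):

HStack {
    Image("star")                                      // fixed size, so it's sized first
    Text("A fairly long album title that needs room")
        .layoutPriority(1)                             // offered remaining space before views with the default priority 0
    Text("4:03")
}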

Vertical alignment:

Don’t align labels in an HStack to .bottom – align them to the baseline instead (.lastTextBaseline)

If there are images too, by default the image’s baseline is the bottom edge, but you can override it this way:

.alignmentGuide(.lastTextBaseline) { d in d[.bottom] * 0.927 }

Aligning views in separate branches of the view tree:

To align views that are in separate stacks to each other, you need to specify a custom named alignment guide:

extension VerticalAlignment {
  private enum MidStarAndTitle: AlignmentID {
    static func defaultValue(in d: ViewDimensions) -> Length {
      return d[.bottom]  // not important
    }
  }

  static let midStarAndTitle = VerticalAlignment(MidStarAndTitle.self)
}

Now, any view with an added modifier .alignmentGuide(.midStarAndTitle) { … } will be aligned to the same line

In the block, specify how to calculate the position of the guide in the view:

.alignmentGuide(.midStarAndTitle) { d in d[.bottom] / 2 }

Drawing graphics:

Graphics in SwiftUI is done with special kinds of views representing shapes, but they’re views just like the buttons and labels, so everything about buttons and labels applies to drawing, and all the effects done for drawing can also be applied to controls

Circle(), Capsule(), Ellipse()
Circle().fill(Color.red)
Capsule().stroke(red, lineWidth: 20)
Ellipse().strokeBorder(red, style: …)
Gradient(colors: [.red, .yellow, …])
AngularGradient(gradient: spectrum, center: .center, angle: .degrees(-90))

Views like Color or Gradient act as independent views themselves, or can be used as a fill for shapes:

Circle().fill(angularGradient)
Circle().strokeBorder(gradient, lineWidth: 50)

Custom shapes:

To define a custom shape, build a struct conforming to the Shape protocol that defines a path:

struct WedgeShape: Shape {
  var wedge: Ring.Wedge

  func path(in rect: CGRect) -> Path {
    var p = Path()
    p.addArc(…)
    p.addLine(…)
    p.closeSubpath()
    return p
  }
}

Custom transition:

struct ScaleAndFade: ViewModifier {
  var isActive: Bool

  func body(content: Content) -> some View {
    return content
      .scaleEffect(isActive ? 0.1 : 1)
      .opacity(isActive ? 0 : 1)
  }
}

let scaleAndFade = AnyTransition.modifier(
  active: ScaleAndFade(isActive: true),
  identity: ScaleAndFade(isActive: false)
)

.transition(scaleAndFade)

When drawing with a lot of elements, mark all shapes inside a container as a “drawing group” (.drawingGroup()) to give a hint to the rendering engine to flatten them all into one native view or layer and render the contents using Metal