WWDC 21
Add rich graphics to your SwiftUI app
Safe area
By default SwiftUI positions your content within the safe area, so that it doesn't get obscured by things at the top and the bottom of the screen, like the navigation bar and the home bar on Face ID phones
If you do want the view to extend to the whole screen, edge to edge, below the top/bottom bars (e.g. on screens that show some kind of graphics), you can opt out of the safe area using .ignoresSafeArea:

ContentView()
    .ignoresSafeArea()
You can also choose on which edges you want to extend the view into the safe area:

ContentView()
    .ignoresSafeArea(edges: .bottom)
You don't need to do this on most screens, since most content should be kept within the safe area
The keyboard, if visible, is also outside the safe area; it actually has its own inner safe area that's positioned inside the main "container safe area"
By default your view is positioned within the keyboard's safe area, so that it doesn't get covered by the keyboard; to opt out of just the keyboard safe area (but stay within the container safe area), pass the .keyboard region as the parameter:

ContentView()
    .ignoresSafeArea(.keyboard)
Backgrounds
There are now new versions of the .background modifier; if you don't pass any specific color or style, it uses the default screen background (white or black, depending on light/dark mode):

HStack { ... }
    .background()
These backgrounds (and backgrounds with an explicit style, like .background(Color.green)) now automatically extend below the safe area, unless you customize it
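To customize this, the style-based .background overload also takes an ignoresSafeAreaEdges parameter (defaulting to .all); a minimal sketch, passing an empty edge set to keep the background within the safe area:

HStack {
    Text("New gradient")
}
.background(.green, ignoresSafeAreaEdges: [])   // background stays inside the safe area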
You can also specify that the background should only be used within a defined shape around the view, like a rounded rectangle:
HStack { ... }
    .background(.cyan, in: RoundedRectangle(cornerRadius: 12))
In this case, the view and its entire background area both stay within the safe area
Materials and vibrancy
Materials are a new kind of background, the kind of translucent blur used in various system areas:
.background(.regularMaterial)
.background(.thickMaterial, in: RoundedRectangle(cornerRadius: 12))
Materials are great when you want the colorful content positioned below some kind of panel to bleed through into its background
There's a set of materials: .ultraThin, .thin, .regular, .thick, .ultraThick; the "thinner" the material, the more of the color from behind comes through
Some of the content displayed over a material background (using foreground colors of "secondary" and below) uses an effect called "vibrancy"; it blends the color of the text with the material background behind it in a specific way that makes it more readable
To apply the vibrancy effect, use the new .foregroundStyle modifier instead of .foregroundColor:

HStack {
    Text("\(stops.count) colors")
        .foregroundStyle(.secondary)
}
.background(.regularMaterial)
You can use .foregroundStyle with hierarchical styles like .secondary and with actual colors (or even gradients), and you can even use both together, in which case SwiftUI applies some alpha effect to the selected color:

VStack {
    Text("Primary").foregroundStyle(.primary)
    Text("Secondary").foregroundStyle(.secondary)
    Text("Tertiary").foregroundStyle(.tertiary)
    Text("Quaternary").foregroundStyle(.quaternary)
}
.foregroundStyle(.purple)
Use .foregroundColor if you want to opt out of vibrancy for a specific view and use an unmodified color
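For example, over the same material background (a minimal sketch; the texts are made up):

VStack {
    Text("Vibrant")
        .foregroundStyle(.secondary)   // blended with the material behind it
    Text("Not vibrant")
        .foregroundColor(.secondary)   // plain color, no vibrancy
}
.padding()
.background(.regularMaterial)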
What's more, a Text can even present an attributed string that has multiple different colors applied to different ranges, and use a vibrancy effect for the parts that don't have a specific color set (it can only have one single foregroundStyle though)
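A minimal sketch of that, assuming an AttributedString with the SwiftUI foregroundColor attribute set on one range (the string and range are made up):

var message = AttributedString("3 colors")
if let range = message.range(of: "colors") {
    message[range].foregroundColor = .red   // this range keeps its explicit color
}

Text(message)
    .foregroundStyle(.secondary)   // the rest of the string stays vibrant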
You can embed one Text inside another to construct a Text with multiple colors in different ranges:

Text("\(stops.count) \(Text("colors").foregroundColor(.red))")
    .foregroundStyle(.secondary)
It's ok to use .foregroundStyle on non-material backgrounds; the vibrancy effect is automatically disabled unless there is an appropriate background behind the text
Text views also automatically disable vibrancy for any embedded emoji
You can read more about vibrancy and materials in my old blog post about the introduction of dark mode in macOS Mojave: "Dark Side of the Mac: Appearance & Materials"
Safe area inset
There is a new view modifier .safeAreaInset that allows you to position a view on top of a ScrollView in a way that adjusts the scroll view's top/bottom content insets, so that all the content can be reached and the beginning/end isn't obscured by the overlay
This is especially useful for views with a blur material background: you want them to be z-stacked over some content on a layer below, so that a hint of the colors shows through, but you also want all the content in the view below to remain reachable by scrolling it out from under the overlay
To position a view this way, put the whole overlay view inside the .safeAreaInset closure:

List { ... }
    .safeAreaInset(edge: .bottom) {
        HStack {
            Text("New gradient")
            Spacer()
            Text("\(stops.count) colors")
                .foregroundStyle(.secondary)
        }
        .padding()
        .background(.thinMaterial)
    }
Canvas
When drawing a large number of graphical elements in a shared area, we can use a ZStack and the .drawingGroup modifier:

var body: some View {
    GeometryReader { proxy in
        ZStack {
            ForEach(state.entries.indices, id: \.self) { index in
                let entry = state.entries[index]

                entry.symbol
                    .scaleEffect(...)
                    .position(...)
                    .onTapGesture {
                        withAnimation { ... }
                    }
            }
        }
    }
    .drawingGroup()
}
A .drawingGroup tells SwiftUI to combine all the views it contains on a single layer to improve performance
This can be used for graphical elements like images, but not for UI controls like text fields or lists
Using .drawingGroup still allows you to do things like setting accessibility properties or applying gestures to each element individually
However, keeping each view's separate identity adds a bit of overhead
When you draw a really large number of elements and you need all the performance you can get, you can now use the new Canvas element; the tradeoff is that you can't attach gestures to individual drawings anymore
A Canvas is given a closure that's run every time the canvas is redrawn; this is a normal, imperative code closure, not a view builder
It works similarly to drawRect in AppKit or UIKit; the canvas gives you a context object that you can run some draw commands on:

Canvas { context, size in
    let image = Image(systemName: "sparkle")

    for i in 0..<10 {
        context.draw(image, at: CGPoint(
            x: 0.5 * size.width + Double(i) * 10,
            y: 0.5 * size.height
        ))
    }
}
When drawing an image, the context needs to "resolve" it to get the actual data it can draw, so if you use the same image multiple times in the canvas, you can resolve it manually up front just once
This also lets you access information like the image size:
Canvas { context, size in
    let image = context.resolve(Image(systemName: "sparkle"))
    let imageSize = image.size

    for i in 0..<10 {
        context.draw(image, at: CGPoint(
            x: 0.5 * size.width + Double(i) * imageSize.width,
            y: 0.5 * size.height
        ))
    }
}
To draw shapes in the canvas, use context.fill()
You can draw e.g. bezier curves or paths derived from standard SwiftUI shapes
You can also use standard SwiftUI color objects:
Canvas { context, size in
    let image = context.resolve(Image(systemName: "sparkle"))
    let imageSize = image.size

    for i in 0..<10 {
        let frame = CGRect(
            x: 0.5 * size.width + Double(i) * imageSize.width,
            y: 0.5 * size.height,
            width: imageSize.width,
            height: imageSize.height
        )

        context.fill(
            Ellipse().path(in: frame),
            with: .color(.cyan)
        )
        context.draw(image, in: frame)
    }
}
To change some properties on the context temporarily, you can make a copy of the context, modify the settings on the copy, and draw some elements using it; the original context is not affected:

Canvas { context, size in
    let image = context.resolve(Image(systemName: "sparkle"))
    let imageSize = image.size

    for i in 0..<10 {
        let frame = ...

        var innerContext = context
        innerContext.opacity = 0.5
        innerContext.fill(Ellipse().path(in: frame), with: .color(.cyan))

        // drawn with the original opacity
        context.draw(image, in: frame)
    }
}
You can set the color with which a symbol is drawn by setting the shading on the resolved image:

var image = context.resolve(Image(systemName: "sparkle"))
image.shading = .color(.blue)
The Canvas is supported on all platforms; it works the same way on iOS, watchOS, tvOS and macOS
Timeline view
Timeline view is a new tool that allows you to create animated views by describing exactly how the view should look at a given point in time
The timeline view requires a "schedule": a description of how often it should change
For drawing an animated canvas, we'll use the .animation schedule, which redraws the view as often as possible
TimelineView gives you a timeline object from which you can read the current time using the date property:

TimelineView(.animation) { timeline in
    Canvas { context, size in
        let now = timeline.date.timeIntervalSinceReferenceDate
        let angle = Angle.degrees(now.remainder(dividingBy: 3) * 120)
        let x = cos(angle.radians)

        let image = context.resolve(Image(systemName: "sparkle"))
        let imageSize = image.size

        // count is assumed to be a property defined on the enclosing view
        for i in 0..<count {
            let frame = CGRect(
                x: 0.5 * size.width + Double(i) * imageSize.width * x,
                y: 0.5 * size.height,
                width: imageSize.width,
                height: imageSize.height
            )
            context.draw(image, in: frame)
        }
    }
}
Since there aren't any individual SwiftUI elements inside the Canvas, you can't specify accessibility information for each element, only for the whole canvas
If you want to include a more detailed description with separate elements, you can use the new .accessibilityChildren modifier to provide a completely separate SwiftUI view hierarchy used only for accessibility purposes
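A minimal sketch of how this might look for the canvas above (the labels are made up; assuming the same count property):

Canvas { context, size in
    // ... drawing code as above ...
}
.accessibilityLabel("Sparkles")
.accessibilityChildren {
    // this hierarchy is never rendered; it only provides the accessibility elements
    HStack {
        ForEach(0..<count, id: \.self) { index in
            Text("Sparkle \(index + 1)")
        }
    }
}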
See more in "SwiftUI accessibility: Beyond the basics"