What’s @concurrent in Swift 6.2? – Donny Wals


Swift 6.2 is out, and it comes with a number of enhancements to Swift Concurrency. One of these features is the @concurrent attribute that we can apply to nonisolated functions. In this post, you'll learn a bit more about what @concurrent is, why it was added to the language, and when you should be using @concurrent.

Before we dig into @concurrent itself, I'd like to provide a little bit of context by exploring another Swift 6.2 feature called nonisolated(nonsending), because without that, @concurrent wouldn't exist at all.

And to make sense of nonisolated(nonsending), we'll go back to nonisolated functions.

Exploring nonisolated functions

A nonisolated function is a function that's not isolated to any specific actor. If you're on Swift 6.1, or you're using Swift 6.2 with its default settings, that means a nonisolated async function will always run on the global executor.

In more practical terms, a nonisolated function would run its work on a background thread.

For example, the following function would always run away from the main actor:

nonisolated
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

While this is a convenient way to run code on the global executor, this behavior can be confusing. If we remove the async from that function, it will always run on the caller's actor:

nonisolated
func decode<T: Decodable>(_ data: Data) throws -> T {
  // ...
}

So if we call this version of decode(_:) from the main actor, it will run on the main actor.
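To make that difference concrete, here's a minimal sketch of both call sites. The Feed type and the decodeAsync/decodeSync names are placeholders I'm using for illustration:

import Foundation

struct Feed: Decodable {}

// Async and nonisolated: with Swift 6.1 or Swift 6.2's default settings,
// this always hops to the global executor.
nonisolated func decodeAsync<T: Decodable>(_ data: Data) async throws -> T {
  try JSONDecoder().decode(T.self, from: data)
}

// Non-async and nonisolated: this always runs on the caller's actor.
nonisolated func decodeSync<T: Decodable>(_ data: Data) throws -> T {
  try JSONDecoder().decode(T.self, from: data)
}

@MainActor
func handleIncomingData(_ data: Data) async throws {
  let feed: Feed = try await decodeAsync(data) // runs off the main actor
  let sameFeed: Feed = try decodeSync(data)    // runs on the main actor
  _ = (feed, sameFeed)
}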

Since that difference in behavior can be unexpected and confusing, the Swift team has added nonisolated(nonsending). So let's see what that does next.

Exploring nonisolated(nonsending) functions

Any function that's marked as nonisolated(nonsending) will always run on the caller's executor. This unifies behavior for async and non-async functions, and it can be applied as follows:

nonisolated(nonsending)
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

Whenever you mark a function like this, it no longer automatically offloads to the global executor. Instead, it will run on the caller's actor.

This doesn't just unify behavior for async and non-async functions, it also makes our code less concurrent and easier to reason about.

When we offload work to the global executor, we're essentially creating new isolation domains. The result of that is that any state that's passed to or accessed inside of our function is potentially accessed concurrently if we have concurrent calls to that function.

This means that we must make the accessed or passed-in state Sendable, and that can become quite a burden over time. For that reason, making functions nonisolated(nonsending) makes a lot of sense. It runs the function on the caller's actor (if any), so if we pass state from our call site into a nonisolated(nonsending) function, that state doesn't get passed into a new isolation context; we stay in the same context we started out from. This means less concurrency, and less complexity in our code.
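Here's a rough sketch of what that buys us. The DecodingStats class and the view model are made-up names, but the point is that the non-Sendable stats object never has to leave the main actor:

import Foundation

// A non-Sendable reference type that lives on the main actor.
final class DecodingStats {
  var decodedCount = 0
}

// Runs on the caller's actor, so passing non-Sendable state into it is fine.
nonisolated(nonsending)
func decode<T: Decodable>(_ data: Data, stats: DecodingStats) async throws -> T {
  let value = try JSONDecoder().decode(T.self, from: data)
  stats.decodedCount += 1
  return value
}

@MainActor
final class FeedViewModel {
  let stats = DecodingStats()

  func refresh(with data: Data) async throws {
    // No new isolation domain is created; we stay on the main actor throughout.
    let names: [String] = try await decode(data, stats: stats)
    print(names.count, stats.decodedCount)
  }
}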

The benefits of nonisolated(nonsending) can really add up, which is why you can make it the default for your nonisolated functions by opting in to Swift 6.2's NonisolatedNonsendingByDefault feature flag.
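In a Swift package, you'd opt in through the upcoming feature flag in your target's Swift settings. Here's a minimal Package.swift sketch (the package and target names are just placeholders):

// swift-tools-version: 6.2
import PackageDescription

let package = Package(
  name: "MyApp",
  targets: [
    .target(
      name: "MyApp",
      swiftSettings: [
        // Make nonisolated async functions run on the caller's actor by default.
        .enableUpcomingFeature("NonisolatedNonsendingByDefault")
      ]
    )
  ]
)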

When your code is nonisolated(nonsending) by default, every function that's either explicitly or implicitly nonisolated will be considered nonisolated(nonsending). This means that we need a new way to offload work to the global executor.

Enter @concurrent.

Offloading work with @concurrent in Swift 6.2

Now that you know a bit more about nonisolated and nonisolated(nonsending), we can finally understand @concurrent.

Using @concurrent makes the most sense when you're using the NonisolatedNonsendingByDefault feature flag as well. Without that feature flag, you can continue using nonisolated to get the same "offload to the global executor" behavior. That said, marking functions as @concurrent can future-proof your code and makes your intent explicit.

With @concurrent, we can make sure that a nonisolated function runs on the global executor:

@concurrent
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

Marking a function as @concurrent will automatically mark that function as nonisolated, so you don't have to write @concurrent nonisolated. We can apply @concurrent to any function that doesn't have its isolation explicitly set. For example, you can apply @concurrent to a function that's defined on a main actor isolated type:

@MainActor
class DataViewModel {
  @concurrent
  func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
  }
}

And even to a function that's defined on an actor:

actor DataViewModel {
  @concurrent
  func decode<T: Decodable>(_ data: Data) async throws -> T {
    // ...
  }
}

You're not allowed to apply @concurrent to functions that have their isolation defined explicitly. Both examples below are incorrect, since the function would have conflicting isolation settings:

@concurrent @MainActor
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

@concurrent nonisolated(nonsending)
func decode<T: Decodable>(_ data: Data) async throws -> T {
  // ...
}

Understanding when to use @concurrent

Using @concurrent is an explicit declaration to offload work to a background thread. Note that doing so introduces a new isolation domain and will require any state involved to be Sendable. That's not always an easy thing to pull off.

In most apps, you only want to introduce @concurrent when you have a real problem to solve where more concurrency helps you.

An example of a case where @concurrent should not be applied is the following:

class Networking {
  func loadData(from url: URL) async throws -> Data {
    let (data, _) = try await URLSession.shared.data(from: url)
    return data
  }
}

The loadData function makes a network call that it awaits with the await keyword. That means that while the network call is active, loadData is suspended. This allows the calling actor to perform other work until loadData is resumed and the data is available.

So when we call loadData from the main actor, the main actor will be free to handle user input while we wait for the network call to complete.
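As a sketch of that call site (the function and URL are made up for illustration):

import Foundation

@MainActor
func refreshFeed() async throws {
  let networking = Networking()
  // While loadData is suspended waiting on the network, the main actor
  // stays free to handle user input and other UI work.
  let data = try await networking.loadData(from: URL(string: "https://example.com/feed")!)
  print("Received \(data.count) bytes")
}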

Now let's imagine that you're fetching a large amount of data that you need to decode. You started off using default code for everything:

class Networking {
  func getFeed() async throws -> Feed {
    let data = try await loadData(from: Feed.endpoint)
    let feed: Feed = try await decode(data)
    return feed
  }

  func loadData(from url: URL) async throws -> Data {
    let (data, _) = try await URLSession.shared.data(from: url)
    return data
  }

  func decode<T: Decodable>(_ data: Data) async throws -> T {
    let decoder = JSONDecoder()
    return try decoder.decode(T.self, from: data)
  }
}

In this example, all of our functions would run on the caller's actor (for example, the main actor). When we find that decode takes a lot of time because we fetched a whole bunch of data, we can decide that our code would benefit from some concurrency in the decoding department.

To do that, we can mark decode as @concurrent:

class Networking {
  // ...

  @concurrent
  func decode<T: Decodable>(_ data: Data) async throws -> T {
    let decoder = JSONDecoder()
    return try decoder.decode(T.self, from: data)
  }
}

All of our other code will continue behaving like it did before by running on the caller's actor. Only decode will run on the global executor, ensuring we're not blocking the main actor during our JSON decoding.

We made the smallest possible unit of work @concurrent to avoid introducing loads of concurrency where we don't need it. Introducing concurrency with @concurrent is not a bad thing, but we do want to limit the amount of concurrency in our app. That's because concurrency comes with a pretty high complexity cost, and less complexity in our code usually means that we write code that's less buggy and easier to maintain in the long run.
