Quickstart iOS

In this tutorial, we will learn how to train a neural network on MNIST using Flower and CoreML on an iOS device.

First of all, to run the Flower Python server, it is recommended to create a virtual environment and run everything within a virtualenv. For the Flower client implementation on iOS, it is recommended to use Xcode as the IDE.

Our example consists of one Python *server* and two iPhone *clients* that all have the same model.

*Clients* are responsible for generating individual model parameter updates based on their local datasets. These updates are then sent to the *server*, which aggregates them to produce an improved model. Finally, the *server* sends this improved version of the model back to each *client*. A complete cycle of parameter updates is called a *round*.

Now that we have a rough idea of what is going on, let's get started with setting up our Flower server environment. We first need to install Flower. You can do this by using pip:

$ pip install flwr

Or Poetry:

$ poetry add flwr

Flower Client

Now that we have all our dependencies installed, let's run a simple distributed training using CoreML as our local training framework and MNIST as our dataset. For simplicity, we will use the complete Flower client with CoreML that has already been implemented and stored inside the Swift SDK. The client implementation can be seen below:

/// Parses the parameters from the local model and returns them as GetParametersRes struct
///
/// - Returns: Parameters from the local model
public func getParameters() -> GetParametersRes {
  let parameters = parameters.weightsToParameters()
  let status = Status(code: .ok, message: String())

  return GetParametersRes(parameters: parameters, status: status)
}

/// Calls the routine to fit the local model
///
/// - Returns: The result from the local training, e.g., updated parameters
public func fit(ins: FitIns) -> FitRes {
  let status = Status(code: .ok, message: String())
  let result = runMLTask(configuration: parameters.parametersToWeights(parameters: ins.parameters), task: .train)
  let parameters = parameters.weightsToParameters()

  return FitRes(parameters: parameters, numExamples: result.numSamples, status: status)
}

/// Calls the routine to evaluate the local model
///
/// - Returns: The result from the evaluation, e.g., loss
public func evaluate(ins: EvaluateIns) -> EvaluateRes {
  let status = Status(code: .ok, message: String())
  let result = runMLTask(configuration: parameters.parametersToWeights(parameters: ins.parameters), task: .test)

  return EvaluateRes(loss: Float(result.loss), numExamples: result.numSamples, status: status)
}

Let's create a new application project in Xcode and add flwr as a dependency to the project. For our application, we will store the logic of our app in FLiOSModel.swift and the UI elements in ContentView.swift. We will focus more on FLiOSModel.swift in this quickstart. Please refer to the full code example to learn more about the app.
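If you prefer declaring the dependency in a Swift Package Manager manifest rather than through Xcode's package UI, a minimal sketch could look like the following. The repository URL, version, and product name are assumptions here and should be checked against the Flower Swift SDK documentation:

// swift-tools-version:5.5
// Package.swift — a minimal sketch of declaring the flwr dependency via Swift Package Manager.
// The repository URL, version, and product name below are assumptions; check the
// Flower Swift SDK documentation for the actual values.
import PackageDescription

let package = Package(
    name: "FLiOSExample",
    platforms: [.iOS(.v14)],
    dependencies: [
        // Placeholder repository URL and version for the Flower Swift SDK.
        .package(url: "https://github.com/adap/flower-swift.git", from: "0.0.1"),
    ],
    targets: [
        .target(
            name: "FLiOSExample",
            // Assumed product name, matching the `import flwr` statement used later.
            dependencies: [.product(name: "flwr", package: "flower-swift")]
        )
    ]
)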

Import Flower and CoreML related packages in FLiOSModel.swift:

import Foundation
import CoreML
import flwr

Then add the mlmodel to the project simply by drag-and-drop; the mlmodel will be bundled inside the application during deployment to your iOS device. We need to pass the URL of the mlmodel to run the CoreML machine learning processes; it can be retrieved by calling the function Bundle.main.url. For the MNIST dataset, we need to preprocess it into an MLBatchProvider object. The preprocessing is done inside DataLoader.swift.
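As a minimal sketch of the URL retrieval mentioned above (the resource name MNIST_Model is a hypothetical placeholder; use the name of the mlmodel file you dragged into the project):

// Locate the bundled mlmodel file.
// "MNIST_Model" is a hypothetical resource name; replace it with the name
// of the mlmodel you added to the project.
guard let url = Bundle.main.url(forResource: "MNIST_Model", withExtension: "mlmodel") else {
    fatalError("Could not find the mlmodel in the application bundle")
}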

// prepare train dataset
let trainBatchProvider = DataLoader.trainBatchProvider() { _ in }

// prepare test dataset
let testBatchProvider = DataLoader.testBatchProvider() { _ in }

// load them together
let dataLoader = MLDataLoader(trainBatchProvider: trainBatchProvider,
                              testBatchProvider: testBatchProvider)

Since CoreML does not allow the model parameters to be seen before training, and accessing the model parameters during or after training can only be done by specifying the layer name, we need to know this information beforehand by looking at the model specification, which is written as proto files. The implementation can be seen in MLModelInspect.

After we have all of the necessary information, let's create our Flower client.

let compiledModelUrl = try MLModel.compileModel(at: url)

// inspect the model to be able to access the model parameters
// to access the model we need to know the layer name
// since the model parameters are stored as key value pairs
let modelInspect = try MLModelInspect(serializedData: Data(contentsOf: url))
let layerWrappers = modelInspect.getLayerWrappers()
self.mlFlwrClient = MLFlwrClient(layerWrappers: layerWrappers,
                                 dataLoader: dataLoader,
                                 compiledModelUrl: compiledModelUrl)

Then start the Flower gRPC client and begin communicating with the server by passing our Flower client to the function startFlwrGRPC.

self.flwrGRPC = FlwrGRPC(serverHost: hostname, serverPort: port)
self.flwrGRPC.startFlwrGRPC(client: self.mlFlwrClient)

That's it for the client. We only have to implement Client or use the provided MLFlwrClient and call startFlwrGRPC(). The attributes hostname and port tell the client which server to connect to. This can be done by entering the hostname and port in the application before clicking the start button to start the federated learning process.
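As an illustration, a start action in FLiOSModel.swift could tie these pieces together roughly as follows; the method name and signature are hypothetical and only reuse the calls shown above:

/// Hypothetical helper called from the start button in the UI.
/// It connects the client to the server address entered by the user; the
/// method name and signature are illustrative and not part of the Swift SDK.
func startFederatedLearning(hostname: String, port: Int) {
    self.flwrGRPC = FlwrGRPC(serverHost: hostname, serverPort: port)
    self.flwrGRPC.startFlwrGRPC(client: self.mlFlwrClient)
}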

Flower Server

For simple workloads we can start a Flower server and leave all the configuration possibilities at their default values. In a file named server.py, import Flower and start the server:

import flwr as fl

fl.server.start_server(config=fl.server.ServerConfig(num_rounds=3))

Train the model, federated!

With both client and server ready, we can now run everything and see federated learning in action. FL systems usually have a server and multiple clients, so we have to start the server first:

$ python server.py

Once the server is running, we can start the clients in different terminals. Build and run the client through Xcode, one through the Xcode Simulator and the other by deploying it to an iPhone. To learn more about how to deploy the app to an iPhone or the Simulator, please visit here.

Congratulations! You've successfully built and run your first federated learning system on your iOS device. The full source code for this example can be found in examples/ios.