C2PA Guide
Sign and validate media within your mobile app
Lens' core features include a camera that automatically signs each capture with C2PA data. Instead of immediately uploading the unaltered image, some apps offer additional features such as filters and drawing annotations. However, altering a signed image, even by a single bit, invalidates the hash and breaks the tamper-evident seal. To uphold the C2PA chain of custody, a new signature, including a hash of the latest version of the file, should be added to the file to account for the changes. You can use LensSigner to do this on any JPG, PNG, MP4, or M4A file that already contains a C2PA manifest.
Lens can also sign, on your behalf, media that you deem trustworthy. Simply use the sign function, outlined below, adding custom assertions if applicable, to sign a C2PA manifest into the media. Click here to see our supported formats and the minimum SDK version needed for each.
Target spec version
The C2PA spec has significant structural and validation differences between versions. While 1.4 is the current default, you can specify which version of the C2PA specification to use when you configure a session. The target spec persists for the duration of the session. The available options are:
public enum TargetSpecOptions: Int {
    case onePointThree = 13 // 1.3
    case onePointFour = 14  // 1.4
    case twoPointOne = 21   // 2.1
}
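The session-configuration API itself isn't shown in this section; as a purely illustrative sketch, assuming a hypothetical configuration type, selecting a target spec might look like:

```swift
// Hypothetical sketch only — LensSessionConfiguration and
// Lens(configuration:) are illustrative names, not the SDK's actual API.
// The chosen target spec persists for the lifetime of the session.
let configuration = LensSessionConfiguration(targetSpec: .twoPointOne)
let lens = Lens(configuration: configuration)
```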
Important C2PA compatibility consideration
By default, images saved on iOS are stored in the Photos app. However, if iCloud is enabled, a limitation within Apple's ecosystem (not related to our SDK) causes any C2PA signatures to be removed from media saved in the Photos app and synced with iCloud. This limitation does not affect users without iCloud enabled; for them, the Photos app preserves signed media metadata as expected.
As a workaround, iCloud users can store signed media in the Files app to avoid this issue. Additionally, developers offering download or sharing features may wish to inform users of this behavior within the app.
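For example, a download feature could hand the signed file to the system document picker so the user can save it into the Files app. This is a minimal sketch; the file URL and presenting view controller are assumptions:

```swift
import UIKit

// Present the system document picker so the user can save signed media
// into the Files app, where C2PA metadata is preserved even with iCloud
// enabled. `signedFileURL` is assumed to point at the signed file on disk.
func exportToFiles(_ signedFileURL: URL, from viewController: UIViewController) {
    let picker = UIDocumentPickerViewController(forExporting: [signedFileURL],
                                                asCopy: true)
    viewController.present(picker, animated: true)
}
```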
Handling signed images
While UIImage is commonly used in iOS apps for image handling, it does not have built-in support for the metadata required to store C2PA manifests. To ensure files stay intact, we recommend representing image data captured through the Lens camera, or imported signed images, as Data objects. These Data objects contain both the image content and the necessary metadata for C2PA authentication.
When it is time to display the image on the screen or perform other visual operations, you can easily convert the Data object back into a UIImage. However, it is important to note that the original Data object, enriched with C2PA metadata, remains the authoritative source. By following this approach, you can preserve the integrity and credibility of your content throughout your application.
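As a sketch of this pattern (the view controller and property names are illustrative):

```swift
import UIKit

final class SignedImageViewController: UIViewController {
    // The signed Data object remains the authoritative, C2PA-bearing copy.
    var signedImageData: Data?

    @IBOutlet weak var imageView: UIImageView!

    func showSignedImage() {
        // Derive a UIImage only for display; never persist or upload the
        // UIImage, since the conversion drops the C2PA metadata.
        guard let data = signedImageData,
              let image = UIImage(data: data) else { return }
        imageView.image = image
    }
}
```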
LensSigner class
The LensSigner class handles all signing outside of the signing that happens automatically when capturing photos, videos, or audio. The Lens class provides access to LensSigner via the public signer property. LensSigner does not have a public initializer.
The sign function takes in the media you want to sign (media), optional custom assertions (customAssertions), optional actions (actions), and a completion closure (completion). Once the signing operation is complete, it invokes the closure with a Result that contains either the signed media or an error.
public func sign<T: Signable>(media: T,
                              customAssertions: [Assertion]?,
                              actions: [LensAction]?,
                              completion: @escaping (Result<T.SignableType, LensSigningError>) -> Void)
Parameters
media
media - an object of type T, a generic which conforms to the Signable protocol. The two types available are:
SignableData for images
SignableURL for videos and audio
customAssertions
customAssertions are statements of fact about the piece of media you are signing. For example, two default assertions that Lens captures automatically are the user's current location and the current date and time. Custom assertions are additional pieces of information tailored to your needs. They are represented by the Assertion struct, and the sign method takes an array of Assertion as input.
public struct Assertion: Codable {
    let label: String
    let value: String
}
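For instance, a simple custom assertion could record the name of a filter you are about to apply. This sketch mirrors the Assertion struct shown above; the label and value are illustrative:

```swift
import Foundation

// Mirrors the Assertion struct shown above.
struct Assertion: Codable {
    let label: String
    let value: String
}

// An illustrative custom assertion recording an applied filter.
let filterAssertion = Assertion(label: "com.example.filter", value: "Pixellate")

// Assertions are Codable, so they can be serialized if needed.
let encoded = try! JSONEncoder().encode(filterAssertion)
print(String(data: encoded, encoding: .utf8)!)
```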
An example of a custom assertion might be one that includes the results of a machine learning object detection pass on the image. Given the output of object detection in JSON:
{
    "image_size": {
        "width": 1024,
        "height": 768,
        "depth": 3
    },
    "annotations": [
        {
            "class_id": 0,
            "score": 0.87,
            "origin": {
                "x": 109,
                "y": 34
            },
            "size": {
                "width": 84,
                "height": 201
            }
        },
        {
            "class_id": 0
            ...
        }
    ]
}
and Codable structs to store the output:
struct MLOutput: Codable {
    var imageSize: ImageSize
    var annotations: [Annotation]
}

struct ImageSize: Codable {
    var width: Double
    var height: Double
    var depth: Int
}

struct Annotation: Codable {
    var classID: Int
    var score: Double
    var origin: CGPoint
    var size: CGSize
}
Convert to a string:
do {
    let encoder = JSONEncoder()
    encoder.outputFormatting = .prettyPrinted
    // Match the snake_case keys (image_size, class_id) in the JSON above
    encoder.keyEncodingStrategy = .convertToSnakeCase
    let jsonData = try encoder.encode(mlOutput)
    if let jsonString = String(data: jsonData, encoding: .utf8) {
        print(jsonString)
    } else {
        print("Failed to convert JSON data to string.")
    }
} catch {
    print("Error encoding object: \(error)")
}
We can add this as a custom assertion:
let assertion = Assertion(label: "annotations", value: jsonString)
self.signer?.sign(media: signableImage,
                  customAssertions: [assertion],
                  actions: [action]) { result in
    ...
}
actions
actions are a type of assertion that provide information on the edits or other actions that affect the media's content. The subject of actions is covered in section 17.10 of the C2PA spec. See LensAction.swift for all actions supported by Lens. The sign method takes an array of LensAction as input. Examples of actions might be cropped or filtered. If no actions are appropriate, simply pass nil for actions in the sign function.
completion handler
The completion handler receives a Swift Result that contains signed media of the same type that was passed in to sign if successful, or an error otherwise.
Signable protocol
public protocol Signable {
    associatedtype SignableType
    var value: SignableType { get }
    var title: String { get }
    var uuid: String { get }
}
For SignableData, this is implemented as:
public struct SignableData: Signable {
    public typealias SignableType = Data
    public let value: Data
    public let title: String
    public let uuid: String
    public let format: ImageFormat
    public let ingredient: Data?

    public init(value: Data, title: String, uuid: String, format: ImageFormat, ingredient: Data? = nil) {
        self.value = value
        self.title = title
        self.uuid = uuid
        self.format = format
        self.ingredient = ingredient
    }
}
You'll see that for images, the associated SignableType is typealiased to Data. For videos and audio, we use SignableURL, which typealiases SignableType to URL.
let signableImage = SignableData(value: imageData,
                                 title: filename,
                                 uuid: uuid,
                                 format: .jpg,
                                 ingredient: originalImage)
PARAMETERS
value - the image data you wish to be signed.
title - the file name, with extension. It's common to use a UUID as the file name.
uuid - a Swift UUID string, a unique identifier for the signing process.
format - this example points to a JPEG image.
ingredient - the original image data from which value was derived.
Example
To put everything together, we'll use a delegate method from Lens Camera as the launching point and apply a Core Image Pixellate filter to the signed Truepic. Then we'll create an instance of SignableData for the filtered image, which we'll then pass to the Signer's sign method.
func lensCameraDidGenerateTruepic(_ imageName: String, _ truepic: Data) {
    // Lens has passed us a signed image from the camera.
    // Convert the truepic data to a UIImage for display/filtering.
    guard let image = UIImage(data: truepic) else { return }
    // Pass the image to our filter routine
    applyPixellateFilter(to: image) { filteredImage in
        guard filteredImage != image,
              let filteredData = filteredImage.jpegData(compressionQuality: 1.0) else { return }
        let uuid = UUID().uuidString
        let filename = uuid + ".jpg"
        // Create the signable image from the filtered image data and the
        // truepic data. Even though converting the original truepic into a
        // UIImage loses the signed metadata, we include the truepic as an
        // "ingredient" of the filter action we just applied, so we'll still
        // have a complete content credential chain of information.
        let signableImage = SignableData(value: filteredData,
                                         title: filename,
                                         uuid: uuid,
                                         format: .jpg,
                                         ingredient: truepic)
        // Define an action which will state what we did to the original
        let appVersion = "Lens Demo " + (Bundle.main.getAppVersionNumber() ?? "")
        let action = LensAction(action: .coreImageFiltered(filterName: "Pixellate",
                                                           when: Date(),
                                                           softwareAgent: appVersion))
        // Dispatch the sign call to a background thread
        DispatchQueue.global(qos: .userInitiated).async {
            self.signer?.sign(media: signableImage,
                              customAssertions: nil,
                              actions: [action]) { result in
                switch result {
                case .success(let signedData):
                    self.saveSignedImageToDisk(signedData,
                                               filename: filename)
                case .failure(let error):
                    print("error = \(error.localizedDescription)")
                }
            }
        }
    }
}
func applyPixellateFilter(to image: UIImage, completion: (UIImage) -> Void) {
    let context = CIContext()
    // CIImage doesn't retain orientation metadata from UIImage, so
    // convert the UIImage to a CGImage, then create a CIImage from
    // the CGImage, restoring orientation when rebuilding the UIImage below.
    guard let cgImage = image.cgImage else {
        completion(image)
        return
    }
    let ciImage = CIImage(cgImage: cgImage)
    let filter = CIFilter.pixellate()
    filter.setValue(64, forKey: kCIInputScaleKey)
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    guard let outputImage = filter.outputImage,
          let outputCGImage = context.createCGImage(outputImage, from: outputImage.extent) else {
        completion(image)
        return
    }
    let finalImage = UIImage(cgImage: outputCGImage, scale: image.scale, orientation: image.imageOrientation)
    completion(finalImage)
}
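The saveSignedImageToDisk helper called above is app code, not an SDK API. A minimal Foundation-only sketch that writes the signed bytes to the app's Documents directory might be:

```swift
import Foundation

// Hypothetical helper used in the example above: write the signed bytes
// to the app's Documents directory. Writing the raw Data (never a
// re-encoded UIImage) keeps the C2PA manifest intact.
func saveSignedImageToDisk(_ signedData: Data, filename: String) {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let destination = documents.appendingPathComponent(filename)
    do {
        try signedData.write(to: destination, options: .atomic)
    } catch {
        print("Failed to save signed image: \(error)")
    }
}
```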
Verify the file
Lens's verify method is simple and powerful; it allows you to locally verify any C2PA manifest contained within an image, video, or audio file in your app without sending data to a server.
Simply instantiate LensVerifier and call one of the two methods, depending on your input data type:
verify(data: someData) for images
verify(url: someURL) for videos or audio
An NSDictionary object will be returned containing the key-value pairs from the C2PA manifest.
Optionally, you can convert the dictionary to JSON:
import LensSDK

let verifier = LensVerifier()
let manifest = verifier.verify(data: imageData)
let json = jsonString(from: manifest)

func jsonString(from dictionary: Any) -> String? {
    guard JSONSerialization.isValidJSONObject(dictionary) else {
        return "Invalid JSON"
    }
    do {
        let data = try JSONSerialization.data(withJSONObject: dictionary, options: [])
        return String(data: data, encoding: .utf8)
    } catch {
        return nil
    }
}
Here's an example of an image with a derivative where a Core Image filter was applied. Note that the dictionary contains an array of two manifests — one for the original capture and another for the derivative work:
[
{
"URI": "com.truepic:urn:uuid:fa011f4d-9a86-41f6-aa01-97782782221e",
"assertions":
{
"c2pa.hash.data":
[
{
"instance_index": 0,
"status": "VALID"
}
],
"c2pa.thumbnail.claim.jpeg":
[
{
"instance_index": 0,
"status": "VALID",
"thumbnail_id": "b4e7cd41afc07dc2d3622c0dd88df11aa56f432870a06df04bdecd708c2e9bc1"
}
],
"com.truepic.custom.blur":
[
{
"data": 106.94015004088502,
"instance_index": 0,
"status": "VALID"
}
],
"com.truepic.custom.odometry":
[
{
"data":
{
"altitude":
[
{
"relative": 0.05282593,
"timestamp": "2023-10-20T20:31:49.007Z"
}
],
"attitude":
[
{
"azimuth": -0.0046563100000000003,
"pitch": 0.87019210999999996,
"roll": 0.025192490000000001,
"timestamp": "2023-10-20T20:31:51.119Z"
}
],
"geomagnetism":
[
{
"timestamp": "2023-10-20T20:31:46.618Z",
"x": 10.18864441,
"y": -45.22263718,
"z": -29.224988939999999
}
],
"gravity":
[
{
"timestamp": "2023-10-20T20:31:51.119Z",
"x": 0.15925381,
"y": -7.4967211999999996,
"z": -6.3201411800000002
}
],
"heading":
[
{
"timestamp": "2023-10-20T20:31:46.618Z",
"true": 258.7242431640625
}
],
"lens": "Back",
"pressure":
[
{
"timestamp": "2023-10-20T20:31:49.007Z",
"value": 97.181838990000003
}
],
"rotation_rate":
[
{
"timestamp": "2023-10-20T20:31:51.119Z",
"x": 0.00031377999999999999,
"y": 0.00060550999999999997,
"z": 0.010696280000000001
}
],
"user_acceleration":
[
{
"timestamp": "2023-10-20T20:31:51.119Z",
"x": 0.094381930000000003,
"y": 0.022473699999999999,
"z": 0.10030396
}
]
},
"instance_index": 0,
"status": "VALID"
}
],
"com.truepic.libc2pa":
[
{
"data":
{
"git_hash": "",
"lib_name": "Truepic C2PA C++ Library",
"lib_version": "3.1.36",
"target_spec_version": "1.3"
},
"instance_index": 0,
"status": "VALID"
}
],
"stds.exif":
[
{
"data":
{
"@context":
{
"exif": "http://ns.adobe.com/exif/1.0/"
},
"exif:DateTimeOriginal": "2023-10-20T20:31:51Z"
},
"instance_index": 0,
"status": "VALID",
"truepic_id": "date_time_original"
},
{
"data":
{
"@context":
{
"exif": "http://ns.adobe.com/exif/1.0/"
},
"exif:GPSAltitude": "260.3",
"exif:GPSHorizontalAccuracy": "12.5",
"exif:GPSLatitude": "39.223899695880",
"exif:GPSLongitude": "-84.243579636656",
"exif:GPSTimeStamp": "2023-10-20T20:31:46Z"
},
"instance_index": 1,
"status": "VALID",
"truepic_id": "gps_date_and_location"
},
{
"data":
{
"@context":
{
"tiff": "http://ns.adobe.com/tiff/1.0/"
},
"tiff:Make": "Apple",
"tiff:Model": "iPhone11,6"
},
"instance_index": 2,
"status": "VALID",
"truepic_id": "make_and_model"
}
]
},
"certificate":
{
"cert_der": "MIIDTzCCAjegAwIB... {remainder of certificate truncated}",
"issuer_name": "IOSClaimSigningCA-Staging",
"organization_name": "Demo Org",
"organization_unit_name": "Demo Org Unit",
"status": "WARNING",
"status_reason": "Certificate validation skipped; no cert chain provided",
"subject_name": "Truepic Lens SDK v1.5.0 in Lens Demo v1.5.0.249",
"valid_not_after": "2023-10-21T19:43:51Z",
"valid_not_before": "2023-10-20T19:43:52Z"
},
"certificate_chain":
[
{
"cert_der": "MIIEijCCAnKgAwIB... {remainder of certificate truncated}"
}
],
"claim":
{
"claim_generator": "Truepic_Lens_iOS/1.5.0 libc2pa/3.1.36",
"claim_generator_info":
[
{
"name": "Truepic Lens iOS",
"version": "1.5.0"
}
],
"dc:format": "image/jpeg",
"dc:title": "1344B3E9-3F38-4326-9497-C410B26D4891.jpg",
"instanceID": "1344B3E9-3F38-4326-9497-C410B26D4891"
},
"is_active": false,
"signature":
{
"signed_by": "Demo Org",
"signed_on": "2023-10-20T20:31:52Z",
"status": "VALID"
},
"trusted_timestamp":
{
"TSA": "Truepic Lens Time-Stamping Authority",
"status": "VALID",
"timestamp": "2023-10-20T20:31:52Z"
}
},
{
"URI": "com.truepic:urn:uuid:4f4715cb-a5f8-41e1-b885-59efe026f781",
"assertions":
{
"c2pa.actions":
[
{
"data":
{
"actions":
[
{
"action": "com.apple.core_image.Thermal",
"softwareAgent": "Lens Demo1.5.0",
"when": "2023-10-20T20:32:00.437Z"
}
]
},
"instance_index": 0,
"status": "VALID"
}
],
"c2pa.hash.data":
[
{
"instance_index": 0,
"status": "VALID"
}
],
"c2pa.ingredient":
[
{
"data":
{
"format": "image/jpeg",
"ingredient_manifest": "self#jumbf=/c2pa/com.truepic:urn:uuid:fa011f4d-9a86-41f6-aa01-97782782221e",
"instanceID": "1344B3E9-3F38-4326-9497-C410B26D4891",
"thumbnailID": "b4e7cd41afc07dc2d3622c0dd88df11aa56f432870a06df04bdecd708c2e9bc1",
"title": "1344B3E9-3F38-4326-9497-C410B26D4891.jpg"
},
"instance_index": 0,
"status": "VALID"
}
],
"c2pa.thumbnail.claim.jpeg":
[
{
"instance_index": 0,
"status": "VALID",
"thumbnail_id": "8fb72ffc673df216a54a4a0c2d8201a4a09b534891577e2dfdbd7255ba592a08"
}
],
"com.truepic.libc2pa":
[
{
"data":
{
"git_hash": "",
"lib_name": "Truepic C2PA C++ Library",
"lib_version": "3.1.36",
"target_spec_version": "1.3"
},
"instance_index": 0,
"status": "VALID"
}
],
"stds.exif":
[
{
"data":
{
"@context":
{
"exif": "http://ns.adobe.com/exif/1.0/"
},
"exif:DateTimeOriginal": "2023-10-20T20:31:59Z"
},
"instance_index": 0,
"status": "VALID",
"truepic_id": "date_time_original"
}
]
},
"certificate":
{
"cert_der": "MIIDTzCCAjegAwIB... {remainder of certificate truncated}",
"issuer_name": "IOSClaimSigningCA-Staging",
"organization_name": "Demo Org",
"organization_unit_name": "Demo Org Unit",
"status": "VALID",
"subject_name": "Truepic Lens SDK v1.5.0 in Lens Demo v1.5.0.249",
"valid_not_after": "2023-10-21T19:43:51Z",
"valid_not_before": "2023-10-20T19:43:52Z"
},
"certificate_chain":
[
{
"cert_der": "MIIEijCCAnKgAwIB... {remainder of certificate truncated}"
}
],
"claim":
{
"claim_generator": "Truepic_Lens_iOS/1.5.0 libc2pa/3.1.36",
"claim_generator_info":
[
{
"name": "Truepic Lens iOS",
"version": "1.5.0"
}
],
"dc:format": "image/jpeg",
"dc:title": "1344B3E9-3F38-4326-9497-C410B26D4891.jpg",
"instanceID": "1344B3E9-3F38-4326-9497-C410B26D4891"
},
"is_active": true,
"signature":
{
"signed_by": "Demo Org",
"signed_on": "2023-10-20T20:32:01Z",
"status": "VALID"
},
"trusted_timestamp":
{
"TSA": "Truepic Lens Time-Stamping Authority",
"status": "VALID",
"timestamp": "2023-10-20T20:32:01Z"
}
}
]
To learn more about different parts of the validation report, see Validate C2PA.
Parse the output
A future release will include helper functions to make parsing/ingesting the data faster. Here are some examples showing how to parse the dictionary, which can be used in your own Content Credentials display.
Get geo coordinates
if let exifs = assertions?["stds.exif"] as? [[String: Any]] {
    for exif in exifs {
        if let id = exif["truepic_id"] as? String, id == "gps_date_and_location",
           let data = exif["data"] as? [String: Any] {
            let gpsLatitude = data["EXIF:GPSLatitude"] as? String ?? data["exif:GPSLatitude"] as? String
            let gpsLongitude = data["EXIF:GPSLongitude"] as? String ?? data["exif:GPSLongitude"] as? String
            if let latitude = gpsLatitude, let longitude = gpsLongitude {
                self.latitude = latitude
                self.longitude = longitude
            }
        }
    }
}
Count the number of modifications
var modifications: Int = 0
let assertions = manifest["assertions"] as? [String: Any]
if let c2paActions = assertions?["c2pa.actions"] as? [NSDictionary] {
    for c2paAction in c2paActions {
        if let action = c2paAction as? [String: Any], let data = action["data"] as? [String: Any] {
            for keyValue in data {
                if keyValue.key.elementsEqual("actions"), let actionsList = keyValue.value as? [[String: Any]] {
                    for storedAction in actionsList {
                        if storedAction["action"] is String {
                            modifications += 1
                        }
                    }
                }
            }
        }
    }
}
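Building on the same structure, you could also collect the action identifiers themselves. This is a sketch against the manifest dictionary shape shown in the sample output above:

```swift
import Foundation

// Collect the action identifiers (e.g. "com.apple.core_image.Thermal")
// from a manifest dictionary with the shape shown in the sample output.
func actionNames(in manifest: [String: Any]) -> [String] {
    var names: [String] = []
    let assertions = manifest["assertions"] as? [String: Any]
    if let c2paActions = assertions?["c2pa.actions"] as? [[String: Any]] {
        for entry in c2paActions {
            if let data = entry["data"] as? [String: Any],
               let actionsList = data["actions"] as? [[String: Any]] {
                for storedAction in actionsList {
                    if let name = storedAction["action"] as? String {
                        names.append(name)
                    }
                }
            }
        }
    }
    return names
}
```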